00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 1007 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3674 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.001 Started by timer 00:00:00.074 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.074 The recommended git tool is: git 00:00:00.075 using credential 00000000-0000-0000-0000-000000000002 00:00:00.076 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.098 Fetching changes from the remote Git repository 00:00:00.101 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.126 Using shallow fetch with depth 1 00:00:00.126 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.126 > git --version # timeout=10 00:00:00.158 > git --version # 'git version 2.39.2' 00:00:00.158 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.179 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.179 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.415 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.425 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.436 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:03.436 > git config core.sparsecheckout # timeout=10 00:00:03.445 > git read-tree -mu HEAD # timeout=10 00:00:03.457 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:03.475 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:03.475 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:03.572 [Pipeline] Start of Pipeline 00:00:03.583 [Pipeline] library 00:00:03.584 Loading library shm_lib@master 00:00:03.584 Library shm_lib@master is cached. Copying from home. 00:00:03.600 [Pipeline] node 00:00:03.620 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.621 [Pipeline] { 00:00:03.629 [Pipeline] catchError 00:00:03.630 [Pipeline] { 00:00:03.640 [Pipeline] wrap 00:00:03.648 [Pipeline] { 00:00:03.655 [Pipeline] stage 00:00:03.656 [Pipeline] { (Prologue) 00:00:03.859 [Pipeline] sh 00:00:04.139 + logger -p user.info -t JENKINS-CI 00:00:04.155 [Pipeline] echo 00:00:04.157 Node: WFP20 00:00:04.164 [Pipeline] sh 00:00:04.460 [Pipeline] setCustomBuildProperty 00:00:04.472 [Pipeline] echo 00:00:04.474 Cleanup processes 00:00:04.479 [Pipeline] sh 00:00:04.758 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.758 1537273 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.770 [Pipeline] sh 00:00:05.046 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.046 ++ grep -v 'sudo pgrep' 00:00:05.046 ++ awk '{print $1}' 00:00:05.046 + sudo kill -9 00:00:05.046 + true 00:00:05.061 [Pipeline] cleanWs 00:00:05.070 [WS-CLEANUP] Deleting project workspace... 
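The process-cleanup step traced above is a stock Jenkins idiom: list anything still running out of the job workspace, force-kill it, and tolerate an empty match. A minimal standalone sketch of the same pattern (the workspace path is this job's; the variable names are illustrative, and the trailing `|| true` plays the role of the bare `+ true` in the trace so a kill with no PIDs does not fail the stage):

    #!/usr/bin/env bash
    # Kill leftover autotest processes before reusing the workspace.
    WS=/var/jenkins/workspace/short-fuzz-phy-autotest
    pids=$(sudo pgrep -af "$WS/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # $pids is left unquoted on purpose so multiple PIDs split into separate args.
    sudo kill -9 $pids || true   # pgrep may have matched nothing; ignore the error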
00:00:05.070 [WS-CLEANUP] Deferred wipeout is used... 00:00:05.076 [WS-CLEANUP] done 00:00:05.080 [Pipeline] setCustomBuildProperty 00:00:05.094 [Pipeline] sh 00:00:05.374 + sudo git config --global --replace-all safe.directory '*' 00:00:05.463 [Pipeline] httpRequest 00:00:06.320 [Pipeline] echo 00:00:06.321 Sorcerer 10.211.164.101 is alive 00:00:06.328 [Pipeline] retry 00:00:06.329 [Pipeline] { 00:00:06.338 [Pipeline] httpRequest 00:00:06.342 HttpMethod: GET 00:00:06.342 URL: http://10.211.164.101/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.342 Sending request to url: http://10.211.164.101/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.351 Response Code: HTTP/1.1 200 OK 00:00:06.352 Success: Status code 200 is in the accepted range: 200,404 00:00:06.352 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.818 [Pipeline] } 00:00:08.835 [Pipeline] // retry 00:00:08.843 [Pipeline] sh 00:00:09.131 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.187 [Pipeline] httpRequest 00:00:10.143 [Pipeline] echo 00:00:10.145 Sorcerer 10.211.164.101 is alive 00:00:10.156 [Pipeline] retry 00:00:10.158 [Pipeline] { 00:00:10.173 [Pipeline] httpRequest 00:00:10.178 HttpMethod: GET 00:00:10.179 URL: http://10.211.164.101/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:10.179 Sending request to url: http://10.211.164.101/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:10.192 Response Code: HTTP/1.1 200 OK 00:00:10.193 Success: Status code 200 is in the accepted range: 200,404 00:00:10.193 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:12.745 [Pipeline] } 00:01:12.763 [Pipeline] // retry 00:01:12.771 [Pipeline] sh 00:01:13.058 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:15.607 [Pipeline] sh 00:01:15.891 + git -C spdk log --oneline -n5 00:01:15.891 c13c99a5e test: Various fixes for Fedora40 00:01:15.891 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:01:15.891 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:01:15.891 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:01:15.891 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:01:15.909 [Pipeline] withCredentials 00:01:15.921 > git --version # timeout=10 00:01:15.932 > git --version # 'git version 2.39.2' 00:01:15.950 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:15.952 [Pipeline] { 00:01:15.958 [Pipeline] retry 00:01:15.960 [Pipeline] { 00:01:15.971 [Pipeline] sh 00:01:16.255 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:16.268 [Pipeline] } 00:01:16.287 [Pipeline] // retry 00:01:16.292 [Pipeline] } 00:01:16.310 [Pipeline] // withCredentials 00:01:16.319 [Pipeline] httpRequest 00:01:17.014 [Pipeline] echo 00:01:17.016 Sorcerer 10.211.164.101 is alive 00:01:17.026 [Pipeline] retry 00:01:17.028 [Pipeline] { 00:01:17.042 [Pipeline] httpRequest 00:01:17.046 HttpMethod: GET 00:01:17.047 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:17.047 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:17.049 Response Code: HTTP/1.1 200 OK 00:01:17.050 Success: Status code 200 is in the accepted range: 200,404 00:01:17.050 Saving 
response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:25.755 [Pipeline] } 00:01:25.773 [Pipeline] // retry 00:01:25.782 [Pipeline] sh 00:01:26.067 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:27.459 [Pipeline] sh 00:01:27.745 + git -C dpdk log --oneline -n5 00:01:27.745 caf0f5d395 version: 22.11.4 00:01:27.745 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:27.745 dc9c799c7d vhost: fix missing spinlock unlock 00:01:27.745 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:27.745 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:27.756 [Pipeline] } 00:01:27.772 [Pipeline] // stage 00:01:27.784 [Pipeline] stage 00:01:27.786 [Pipeline] { (Prepare) 00:01:27.813 [Pipeline] writeFile 00:01:27.830 [Pipeline] sh 00:01:28.116 + logger -p user.info -t JENKINS-CI 00:01:28.131 [Pipeline] sh 00:01:28.421 + logger -p user.info -t JENKINS-CI 00:01:28.433 [Pipeline] sh 00:01:28.719 + cat autorun-spdk.conf 00:01:28.719 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:28.719 SPDK_RUN_UBSAN=1 00:01:28.719 SPDK_TEST_FUZZER=1 00:01:28.719 SPDK_TEST_FUZZER_SHORT=1 00:01:28.719 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:28.719 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:28.727 RUN_NIGHTLY=1 00:01:28.732 [Pipeline] readFile 00:01:28.757 [Pipeline] withEnv 00:01:28.759 [Pipeline] { 00:01:28.775 [Pipeline] sh 00:01:29.066 + set -ex 00:01:29.066 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:29.066 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:29.066 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:29.066 ++ SPDK_RUN_UBSAN=1 00:01:29.066 ++ SPDK_TEST_FUZZER=1 00:01:29.066 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:29.066 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:29.066 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:29.066 ++ RUN_NIGHTLY=1 00:01:29.066 + case $SPDK_TEST_NVMF_NICS in 00:01:29.066 + DRIVERS= 00:01:29.066 + [[ -n '' ]] 00:01:29.066 + exit 0 00:01:29.076 [Pipeline] } 00:01:29.090 [Pipeline] // withEnv 00:01:29.096 [Pipeline] } 00:01:29.109 [Pipeline] // stage 00:01:29.118 [Pipeline] catchError 00:01:29.120 [Pipeline] { 00:01:29.135 [Pipeline] timeout 00:01:29.135 Timeout set to expire in 30 min 00:01:29.138 [Pipeline] { 00:01:29.155 [Pipeline] stage 00:01:29.158 [Pipeline] { (Tests) 00:01:29.176 [Pipeline] sh 00:01:29.462 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:29.462 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:29.462 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:29.462 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:29.462 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:29.462 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:29.462 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:29.462 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:29.462 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:29.462 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:29.462 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:29.462 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:29.462 + source /etc/os-release
00:01:29.463 ++ NAME='Fedora Linux'
00:01:29.463 ++ VERSION='39 (Cloud Edition)'
00:01:29.463 ++ ID=fedora
00:01:29.463 ++ VERSION_ID=39
00:01:29.463 ++ VERSION_CODENAME=
00:01:29.463 ++ PLATFORM_ID=platform:f39
00:01:29.463 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:29.463 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:29.463 ++ LOGO=fedora-logo-icon
00:01:29.463 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:29.463 ++ HOME_URL=https://fedoraproject.org/
00:01:29.463 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:29.463 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:29.463 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:29.463 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:29.463 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:29.463 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:29.463 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:29.463 ++ SUPPORT_END=2024-11-12
00:01:29.463 ++ VARIANT='Cloud Edition'
00:01:29.463 ++ VARIANT_ID=cloud
00:01:29.463 + uname -a
00:01:29.463 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:29.463 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:31.999 Hugepages
00:01:31.999 node hugesize free / total
00:01:31.999 node0 1048576kB 0 / 0
00:01:31.999 node0 2048kB 0 / 0
00:01:31.999 node1 1048576kB 0 / 0
00:01:31.999 node1 2048kB 0 / 0
00:01:31.999
00:01:31.999 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:31.999 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:31.999 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:31.999 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:31.999 + rm -f /tmp/spdk-ld-path
00:01:31.999 + source autorun-spdk.conf
00:01:31.999 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:31.999 ++ SPDK_RUN_UBSAN=1
00:01:31.999 ++ SPDK_TEST_FUZZER=1
00:01:31.999 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:31.999 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:31.999 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:31.999 ++ RUN_NIGHTLY=1
00:01:31.999 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:31.999 + [[ -n '' ]]
00:01:31.999 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:31.999 + for M in /var/spdk/build-*-manifest.txt
00:01:31.999 + [[ -f
/var/spdk/build-kernel-manifest.txt ]] 00:01:31.999 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:31.999 + for M in /var/spdk/build-*-manifest.txt 00:01:31.999 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:31.999 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:31.999 + for M in /var/spdk/build-*-manifest.txt 00:01:31.999 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:31.999 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:31.999 ++ uname 00:01:31.999 + [[ Linux == \L\i\n\u\x ]] 00:01:31.999 + sudo dmesg -T 00:01:31.999 + sudo dmesg --clear 00:01:31.999 + dmesg_pid=1538193 00:01:31.999 + [[ Fedora Linux == FreeBSD ]] 00:01:31.999 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:31.999 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:31.999 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:31.999 + [[ -x /usr/src/fio-static/fio ]] 00:01:31.999 + export FIO_BIN=/usr/src/fio-static/fio 00:01:31.999 + FIO_BIN=/usr/src/fio-static/fio 00:01:31.999 + sudo dmesg -Tw 00:01:31.999 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:31.999 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:31.999 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:31.999 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:31.999 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:31.999 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:31.999 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:31.999 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:31.999 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:31.999 Test configuration: 00:01:31.999 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:31.999 SPDK_RUN_UBSAN=1 00:01:31.999 SPDK_TEST_FUZZER=1 00:01:31.999 SPDK_TEST_FUZZER_SHORT=1 00:01:31.999 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:31.999 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:31.999 RUN_NIGHTLY=1 07:26:42 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:01:31.999 07:26:42 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:31.999 07:26:42 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:31.999 07:26:42 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:31.999 07:26:42 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:31.999 07:26:42 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:31.999 07:26:42 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:31.999 07:26:42 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:31.999 07:26:42 -- paths/export.sh@5 -- $ export PATH 00:01:31.999 07:26:42 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:31.999 07:26:42 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:31.999 07:26:42 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:31.999 07:26:42 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732775202.XXXXXX 00:01:31.999 07:26:42 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732775202.CenGXH 00:01:31.999 07:26:42 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:31.999 07:26:42 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:01:31.999 07:26:42 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:31.999 07:26:42 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:31.999 07:26:42 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:31.999 07:26:42 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:31.999 07:26:42 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:31.999 07:26:42 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:31.999 07:26:42 -- common/autotest_common.sh@10 -- $ set +x 00:01:32.000 07:26:42 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:32.000 07:26:42 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:32.000 07:26:42 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:32.000 07:26:42 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:32.000 07:26:42 -- spdk/autobuild.sh@16 -- $ date -u 00:01:32.000 Thu Nov 28 06:26:42 AM UTC 2024 00:01:32.000 07:26:42 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:32.000 LTS-67-gc13c99a5e 00:01:32.000 07:26:42 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:32.000 07:26:42 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:32.000 07:26:42 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:32.000 07:26:42 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:32.000 07:26:42 -- common/autotest_common.sh@1093 -- $ 
xtrace_disable 00:01:32.000 07:26:42 -- common/autotest_common.sh@10 -- $ set +x 00:01:32.000 ************************************ 00:01:32.000 START TEST ubsan 00:01:32.000 ************************************ 00:01:32.000 07:26:42 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:32.000 using ubsan 00:01:32.000 00:01:32.000 real 0m0.000s 00:01:32.000 user 0m0.000s 00:01:32.000 sys 0m0.000s 00:01:32.000 07:26:42 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:32.000 07:26:42 -- common/autotest_common.sh@10 -- $ set +x 00:01:32.000 ************************************ 00:01:32.000 END TEST ubsan 00:01:32.000 ************************************ 00:01:32.000 07:26:42 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:32.000 07:26:42 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:32.000 07:26:42 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:32.000 07:26:42 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:32.000 07:26:42 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:32.000 07:26:42 -- common/autotest_common.sh@10 -- $ set +x 00:01:32.000 ************************************ 00:01:32.000 START TEST build_native_dpdk 00:01:32.000 ************************************ 00:01:32.000 07:26:42 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk 00:01:32.000 07:26:42 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:32.000 07:26:42 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:32.000 07:26:42 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:32.000 07:26:42 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:32.000 07:26:42 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:32.000 07:26:42 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:32.000 07:26:42 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:32.000 07:26:42 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:32.000 07:26:42 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:32.000 07:26:42 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:32.000 07:26:42 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:32.000 07:26:42 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:32.000 07:26:42 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:32.000 07:26:42 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:32.000 07:26:42 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:32.260 07:26:42 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:32.261 07:26:42 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:32.261 07:26:42 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:32.261 07:26:42 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:32.261 caf0f5d395 version: 22.11.4 00:01:32.261 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:32.261 dc9c799c7d vhost: fix missing spinlock unlock 00:01:32.261 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:32.261 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:32.261 07:26:42 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:32.261 07:26:42 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:32.261 07:26:42 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:32.261 07:26:42 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:32.261 07:26:42 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:32.261 07:26:42 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:32.261 07:26:42 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:32.261 07:26:42 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:32.261 07:26:42 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:32.261 07:26:42 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:32.261 07:26:42 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:32.261 07:26:42 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:32.261 07:26:42 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:32.261 07:26:42 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:32.261 07:26:42 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:32.261 07:26:42 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:32.261 07:26:42 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:32.261 07:26:42 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:32.261 07:26:42 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:32.261 07:26:42 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:32.261 07:26:42 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:32.261 07:26:42 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:32.261 07:26:42 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:32.261 07:26:42 -- scripts/common.sh@343 -- $ case "$op" in 00:01:32.261 07:26:42 -- scripts/common.sh@344 -- $ : 1 00:01:32.261 07:26:42 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:32.261 07:26:42 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:32.261 07:26:42 -- scripts/common.sh@364 -- $ decimal 22 00:01:32.261 07:26:42 -- scripts/common.sh@352 -- $ local d=22 00:01:32.261 07:26:42 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:32.261 07:26:42 -- scripts/common.sh@354 -- $ echo 22 00:01:32.261 07:26:42 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:32.261 07:26:42 -- scripts/common.sh@365 -- $ decimal 21 00:01:32.261 07:26:42 -- scripts/common.sh@352 -- $ local d=21 00:01:32.261 07:26:42 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:32.261 07:26:42 -- scripts/common.sh@354 -- $ echo 21 00:01:32.261 07:26:42 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:32.261 07:26:42 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:32.261 07:26:42 -- scripts/common.sh@366 -- $ return 1 00:01:32.261 07:26:42 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:32.261 patching file config/rte_config.h 00:01:32.261 Hunk #1 succeeded at 60 (offset 1 line). 00:01:32.261 07:26:42 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:32.261 07:26:42 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:32.261 07:26:42 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:32.261 07:26:42 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:32.261 07:26:42 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:32.261 07:26:42 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:32.261 07:26:42 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:32.261 07:26:42 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:32.261 07:26:42 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:32.261 07:26:42 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:32.261 07:26:42 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:32.261 07:26:42 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:32.261 07:26:42 -- scripts/common.sh@343 -- $ case "$op" in 00:01:32.261 07:26:42 -- scripts/common.sh@344 -- $ : 1 00:01:32.261 07:26:42 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:32.261 07:26:42 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:32.261 07:26:42 -- scripts/common.sh@364 -- $ decimal 22 00:01:32.261 07:26:42 -- scripts/common.sh@352 -- $ local d=22 00:01:32.261 07:26:42 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:32.261 07:26:42 -- scripts/common.sh@354 -- $ echo 22 00:01:32.261 07:26:42 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:32.261 07:26:42 -- scripts/common.sh@365 -- $ decimal 24 00:01:32.261 07:26:42 -- scripts/common.sh@352 -- $ local d=24 00:01:32.261 07:26:42 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:32.261 07:26:42 -- scripts/common.sh@354 -- $ echo 24 00:01:32.261 07:26:42 -- scripts/common.sh@365 -- $ ver2[v]=24 00:01:32.261 07:26:42 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:32.261 07:26:42 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:01:32.261 07:26:42 -- scripts/common.sh@367 -- $ return 0 00:01:32.261 07:26:42 -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:32.261 patching file lib/pcapng/rte_pcapng.c 00:01:32.261 Hunk #1 succeeded at 110 (offset -18 lines). 
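The `lt`/`cmp_versions` trace above is scripts/common.sh splitting both version strings on `.`, `-` and `:` and comparing them component by component, with each component validated as numeric by the `decimal` helper. Condensed into one self-contained function (the name `version_lt` is mine, not SPDK's, and it assumes purely numeric components): `version_lt 22.11.4 24.07.0` succeeds, which is why the rte_pcapng patch above was applied, while `version_lt 22.11.4 21.11.0` fails, just as the earlier `return 1` shows.

    # Returns 0 (true) when $1 is strictly older than $2, else 1.
    version_lt() {
        local IFS=.-:                 # same separators the traced helper uses
        local -a ver1=($1) ver2=($2)  # unquoted on purpose: split on IFS
        local v
        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
        done
        return 1                      # equal versions are not less-than
    }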
00:01:32.261 07:26:42 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:01:32.261 07:26:42 -- common/autobuild_common.sh@181 -- $ uname -s 00:01:32.261 07:26:42 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:01:32.261 07:26:42 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:32.261 07:26:42 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:36.458 The Meson build system 00:01:36.458 Version: 1.5.0 00:01:36.458 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:36.458 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:36.458 Build type: native build 00:01:36.458 Program cat found: YES (/usr/bin/cat) 00:01:36.458 Project name: DPDK 00:01:36.458 Project version: 22.11.4 00:01:36.458 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:36.458 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:36.458 Host machine cpu family: x86_64 00:01:36.458 Host machine cpu: x86_64 00:01:36.458 Message: ## Building in Developer Mode ## 00:01:36.458 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:36.458 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:36.458 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:36.458 Program objdump found: YES (/usr/bin/objdump) 00:01:36.458 Program python3 found: YES (/usr/bin/python3) 00:01:36.458 Program cat found: YES (/usr/bin/cat) 00:01:36.458 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
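For readability, the one-line meson invocation in the trace above, reflowed with one option per line. The flags are copied verbatim; the only change is the spelling `meson setup`, which is what the deprecation warning at the very end of this configure run asks for (the output above separately flags `-Dmachine` as deprecated in favor of `cpu_instruction_set`):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib \
        -Denable_docs=false \
        -Denable_kmods=false \
        -Dtests=false \
        -Dc_link_args= \
        '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,

The trailing comma in `-Denable_drivers` comes from the `printf %s,` join in autobuild_common.sh and is accepted by the DPDK build, as the successful configure run here shows.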
00:01:36.458 Checking for size of "void *" : 8 00:01:36.458 Checking for size of "void *" : 8 (cached) 00:01:36.458 Library m found: YES 00:01:36.458 Library numa found: YES 00:01:36.458 Has header "numaif.h" : YES 00:01:36.458 Library fdt found: NO 00:01:36.458 Library execinfo found: NO 00:01:36.458 Has header "execinfo.h" : YES 00:01:36.458 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:36.458 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:36.458 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:36.458 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:36.458 Run-time dependency openssl found: YES 3.1.1 00:01:36.458 Run-time dependency libpcap found: YES 1.10.4 00:01:36.458 Has header "pcap.h" with dependency libpcap: YES 00:01:36.458 Compiler for C supports arguments -Wcast-qual: YES 00:01:36.458 Compiler for C supports arguments -Wdeprecated: YES 00:01:36.458 Compiler for C supports arguments -Wformat: YES 00:01:36.458 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:36.458 Compiler for C supports arguments -Wformat-security: NO 00:01:36.459 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:36.459 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:36.459 Compiler for C supports arguments -Wnested-externs: YES 00:01:36.459 Compiler for C supports arguments -Wold-style-definition: YES 00:01:36.459 Compiler for C supports arguments -Wpointer-arith: YES 00:01:36.459 Compiler for C supports arguments -Wsign-compare: YES 00:01:36.459 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:36.459 Compiler for C supports arguments -Wundef: YES 00:01:36.459 Compiler for C supports arguments -Wwrite-strings: YES 00:01:36.459 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:36.459 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:36.459 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:36.459 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:36.459 Compiler for C supports arguments -mavx512f: YES 00:01:36.459 Checking if "AVX512 checking" compiles: YES 00:01:36.459 Fetching value of define "__SSE4_2__" : 1 00:01:36.459 Fetching value of define "__AES__" : 1 00:01:36.459 Fetching value of define "__AVX__" : 1 00:01:36.459 Fetching value of define "__AVX2__" : 1 00:01:36.459 Fetching value of define "__AVX512BW__" : 1 00:01:36.459 Fetching value of define "__AVX512CD__" : 1 00:01:36.459 Fetching value of define "__AVX512DQ__" : 1 00:01:36.459 Fetching value of define "__AVX512F__" : 1 00:01:36.459 Fetching value of define "__AVX512VL__" : 1 00:01:36.459 Fetching value of define "__PCLMUL__" : 1 00:01:36.459 Fetching value of define "__RDRND__" : 1 00:01:36.459 Fetching value of define "__RDSEED__" : 1 00:01:36.459 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:36.459 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:36.459 Message: lib/kvargs: Defining dependency "kvargs" 00:01:36.459 Message: lib/telemetry: Defining dependency "telemetry" 00:01:36.459 Checking for function "getentropy" : YES 00:01:36.459 Message: lib/eal: Defining dependency "eal" 00:01:36.459 Message: lib/ring: Defining dependency "ring" 00:01:36.459 Message: lib/rcu: Defining dependency "rcu" 00:01:36.459 Message: lib/mempool: Defining dependency "mempool" 00:01:36.459 Message: lib/mbuf: Defining dependency "mbuf" 00:01:36.459 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:36.459 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:36.459 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:36.459 Compiler for C supports arguments -mpclmul: YES 00:01:36.459 Compiler for C supports arguments -maes: YES 00:01:36.459 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:36.459 Compiler for C supports arguments -mavx512bw: YES 00:01:36.459 Compiler for C supports arguments -mavx512dq: YES 00:01:36.459 Compiler for C supports arguments -mavx512vl: YES 00:01:36.459 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:36.459 Compiler for C supports arguments -mavx2: YES 00:01:36.459 Compiler for C supports arguments -mavx: YES 00:01:36.459 Message: lib/net: Defining dependency "net" 00:01:36.459 Message: lib/meter: Defining dependency "meter" 00:01:36.459 Message: lib/ethdev: Defining dependency "ethdev" 00:01:36.459 Message: lib/pci: Defining dependency "pci" 00:01:36.459 Message: lib/cmdline: Defining dependency "cmdline" 00:01:36.459 Message: lib/metrics: Defining dependency "metrics" 00:01:36.459 Message: lib/hash: Defining dependency "hash" 00:01:36.459 Message: lib/timer: Defining dependency "timer" 00:01:36.459 Fetching value of define "__AVX2__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:36.459 Message: lib/acl: Defining dependency "acl" 00:01:36.459 Message: lib/bbdev: Defining dependency "bbdev" 00:01:36.459 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:36.459 Run-time dependency libelf found: YES 0.191 00:01:36.459 Message: lib/bpf: Defining dependency "bpf" 00:01:36.459 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:36.459 Message: lib/compressdev: Defining dependency "compressdev" 00:01:36.459 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:36.459 Message: lib/distributor: Defining dependency "distributor" 00:01:36.459 Message: lib/efd: Defining dependency "efd" 00:01:36.459 Message: lib/eventdev: Defining dependency "eventdev" 00:01:36.459 Message: lib/gpudev: Defining dependency "gpudev" 00:01:36.459 Message: lib/gro: Defining dependency "gro" 00:01:36.459 Message: lib/gso: Defining dependency "gso" 00:01:36.459 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:36.459 Message: lib/jobstats: Defining dependency "jobstats" 00:01:36.459 Message: lib/latencystats: Defining dependency "latencystats" 00:01:36.459 Message: lib/lpm: Defining dependency "lpm" 00:01:36.459 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:36.459 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:36.459 Message: lib/member: Defining dependency "member" 00:01:36.459 Message: lib/pcapng: Defining dependency "pcapng" 00:01:36.459 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:36.459 Message: lib/power: Defining dependency "power" 00:01:36.459 Message: lib/rawdev: Defining dependency "rawdev" 00:01:36.459 Message: lib/regexdev: Defining dependency "regexdev" 00:01:36.459 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:36.459 Message: lib/rib: Defining dependency "rib" 00:01:36.459 Message: lib/reorder: Defining dependency "reorder" 00:01:36.459 Message: lib/sched: Defining dependency "sched" 00:01:36.459 Message: lib/security: Defining dependency "security" 00:01:36.459 Message: lib/stack: Defining dependency "stack" 00:01:36.459 Has header "linux/userfaultfd.h" : YES 00:01:36.459 Message: lib/vhost: Defining dependency "vhost" 00:01:36.459 Message: lib/ipsec: Defining dependency "ipsec" 00:01:36.459 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:36.459 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:36.459 Message: lib/fib: Defining dependency "fib" 00:01:36.459 Message: lib/port: Defining dependency "port" 00:01:36.459 Message: lib/pdump: Defining dependency "pdump" 00:01:36.459 Message: lib/table: Defining dependency "table" 00:01:36.459 Message: lib/pipeline: Defining dependency "pipeline" 00:01:36.459 Message: lib/graph: Defining dependency "graph" 00:01:36.459 Message: lib/node: Defining dependency "node" 00:01:36.459 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:36.459 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:36.459 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:36.459 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:36.459 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:36.459 Compiler for C supports arguments -Wno-unused-value: YES 00:01:36.459 Compiler for C supports arguments -Wno-format: YES 00:01:36.459 Compiler for C supports arguments -Wno-format-security: YES 00:01:36.459 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:37.397 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:37.397 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:37.397 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:37.397 Fetching value of define "__AVX2__" : 1 (cached) 00:01:37.397 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:37.397 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:37.397 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:37.397 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:37.397 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:37.397 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:37.397 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:37.397 Configuring doxy-api.conf using configuration 00:01:37.397 Program sphinx-build found: NO 00:01:37.397 Configuring rte_build_config.h using configuration 00:01:37.397 Message: 00:01:37.397 ================= 00:01:37.397 Applications Enabled 00:01:37.397 ================= 00:01:37.397 00:01:37.397 apps: 00:01:37.397 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:37.397 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:37.397 test-security-perf, 00:01:37.397 00:01:37.397 Message: 00:01:37.397 ================= 00:01:37.397 Libraries Enabled 00:01:37.397 ================= 00:01:37.397 00:01:37.397 libs: 00:01:37.397 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:37.397 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:37.397 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:37.397 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:37.397 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:37.397 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:37.397 table, pipeline, graph, node, 00:01:37.397 00:01:37.397 Message: 00:01:37.397 =============== 00:01:37.397 Drivers Enabled 00:01:37.397 =============== 00:01:37.397 00:01:37.397 common: 00:01:37.397 00:01:37.397 bus: 00:01:37.397 pci, vdev, 00:01:37.397 mempool: 00:01:37.397 ring, 00:01:37.397 dma: 00:01:37.397 00:01:37.397 net: 00:01:37.397 i40e, 00:01:37.397 raw: 00:01:37.397 00:01:37.397 crypto: 00:01:37.397 00:01:37.397 compress: 00:01:37.397 00:01:37.397 regex: 00:01:37.397 00:01:37.397 vdpa: 00:01:37.397 00:01:37.397 event: 00:01:37.397 00:01:37.397 baseband: 00:01:37.397 00:01:37.397 gpu: 00:01:37.397 00:01:37.397 00:01:37.397 Message: 00:01:37.397 ================= 00:01:37.397 Content Skipped 00:01:37.397 ================= 00:01:37.397 00:01:37.397 apps: 00:01:37.397 00:01:37.397 libs: 00:01:37.397 kni: explicitly disabled via build config (deprecated lib) 00:01:37.397 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:37.397 00:01:37.397 drivers: 00:01:37.397 common/cpt: not in enabled drivers build config 00:01:37.397 common/dpaax: not in enabled drivers build config 00:01:37.397 common/iavf: not in enabled drivers build config 00:01:37.397 common/idpf: not in enabled drivers build config 00:01:37.397 common/mvep: not in enabled drivers build config 00:01:37.397 common/octeontx: not in enabled drivers build config 00:01:37.397 bus/auxiliary: not in enabled drivers build config 00:01:37.397 bus/dpaa: not in enabled drivers build config 00:01:37.397 bus/fslmc: not in enabled drivers build config 00:01:37.397 bus/ifpga: not in enabled drivers build config 00:01:37.397 bus/vmbus: not in enabled drivers build config 00:01:37.397 common/cnxk: not in enabled drivers build config 00:01:37.397 common/mlx5: not in enabled drivers build config 00:01:37.397 common/qat: not in enabled drivers build config 00:01:37.397 common/sfc_efx: not in enabled drivers build config 00:01:37.397 mempool/bucket: not in enabled drivers build config 00:01:37.397 mempool/cnxk: not in enabled drivers build config 00:01:37.397 mempool/dpaa: not in enabled drivers build config 00:01:37.397 mempool/dpaa2: not in enabled drivers build config 00:01:37.397 mempool/octeontx: not in enabled drivers build config 00:01:37.397 mempool/stack: not in enabled drivers build config 00:01:37.397 dma/cnxk: not in enabled drivers build config 00:01:37.397 dma/dpaa: not in enabled drivers build config 00:01:37.397 dma/dpaa2: not in enabled drivers build config 00:01:37.397 dma/hisilicon: not in enabled drivers build config 00:01:37.397 dma/idxd: not in enabled drivers build config 00:01:37.397 dma/ioat: not in enabled drivers build config 00:01:37.397 dma/skeleton: not in enabled drivers build config 00:01:37.397 net/af_packet: not in enabled drivers build config 00:01:37.397 net/af_xdp: not in enabled drivers build config 00:01:37.397 net/ark: not in enabled drivers build config 00:01:37.397 net/atlantic: not in enabled drivers build config 00:01:37.397 net/avp: not in enabled drivers build config 00:01:37.397 net/axgbe: not in enabled drivers build config 00:01:37.397 net/bnx2x: not in enabled drivers build config 00:01:37.397 net/bnxt: not in enabled drivers build config 00:01:37.397 net/bonding: not in enabled drivers build config 00:01:37.397 net/cnxk: not in enabled drivers build config 
00:01:37.397 net/cxgbe: not in enabled drivers build config 00:01:37.397 net/dpaa: not in enabled drivers build config 00:01:37.397 net/dpaa2: not in enabled drivers build config 00:01:37.397 net/e1000: not in enabled drivers build config 00:01:37.397 net/ena: not in enabled drivers build config 00:01:37.397 net/enetc: not in enabled drivers build config 00:01:37.397 net/enetfec: not in enabled drivers build config 00:01:37.397 net/enic: not in enabled drivers build config 00:01:37.397 net/failsafe: not in enabled drivers build config 00:01:37.397 net/fm10k: not in enabled drivers build config 00:01:37.397 net/gve: not in enabled drivers build config 00:01:37.397 net/hinic: not in enabled drivers build config 00:01:37.397 net/hns3: not in enabled drivers build config 00:01:37.397 net/iavf: not in enabled drivers build config 00:01:37.397 net/ice: not in enabled drivers build config 00:01:37.397 net/idpf: not in enabled drivers build config 00:01:37.397 net/igc: not in enabled drivers build config 00:01:37.397 net/ionic: not in enabled drivers build config 00:01:37.397 net/ipn3ke: not in enabled drivers build config 00:01:37.397 net/ixgbe: not in enabled drivers build config 00:01:37.397 net/kni: not in enabled drivers build config 00:01:37.397 net/liquidio: not in enabled drivers build config 00:01:37.397 net/mana: not in enabled drivers build config 00:01:37.397 net/memif: not in enabled drivers build config 00:01:37.397 net/mlx4: not in enabled drivers build config 00:01:37.397 net/mlx5: not in enabled drivers build config 00:01:37.397 net/mvneta: not in enabled drivers build config 00:01:37.397 net/mvpp2: not in enabled drivers build config 00:01:37.397 net/netvsc: not in enabled drivers build config 00:01:37.397 net/nfb: not in enabled drivers build config 00:01:37.397 net/nfp: not in enabled drivers build config 00:01:37.398 net/ngbe: not in enabled drivers build config 00:01:37.398 net/null: not in enabled drivers build config 00:01:37.398 net/octeontx: not in enabled drivers build config 00:01:37.398 net/octeon_ep: not in enabled drivers build config 00:01:37.398 net/pcap: not in enabled drivers build config 00:01:37.398 net/pfe: not in enabled drivers build config 00:01:37.398 net/qede: not in enabled drivers build config 00:01:37.398 net/ring: not in enabled drivers build config 00:01:37.398 net/sfc: not in enabled drivers build config 00:01:37.398 net/softnic: not in enabled drivers build config 00:01:37.398 net/tap: not in enabled drivers build config 00:01:37.398 net/thunderx: not in enabled drivers build config 00:01:37.398 net/txgbe: not in enabled drivers build config 00:01:37.398 net/vdev_netvsc: not in enabled drivers build config 00:01:37.398 net/vhost: not in enabled drivers build config 00:01:37.398 net/virtio: not in enabled drivers build config 00:01:37.398 net/vmxnet3: not in enabled drivers build config 00:01:37.398 raw/cnxk_bphy: not in enabled drivers build config 00:01:37.398 raw/cnxk_gpio: not in enabled drivers build config 00:01:37.398 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:37.398 raw/ifpga: not in enabled drivers build config 00:01:37.398 raw/ntb: not in enabled drivers build config 00:01:37.398 raw/skeleton: not in enabled drivers build config 00:01:37.398 crypto/armv8: not in enabled drivers build config 00:01:37.398 crypto/bcmfs: not in enabled drivers build config 00:01:37.398 crypto/caam_jr: not in enabled drivers build config 00:01:37.398 crypto/ccp: not in enabled drivers build config 00:01:37.398 crypto/cnxk: not in enabled drivers 
build config 00:01:37.398 crypto/dpaa_sec: not in enabled drivers build config 00:01:37.398 crypto/dpaa2_sec: not in enabled drivers build config 00:01:37.398 crypto/ipsec_mb: not in enabled drivers build config 00:01:37.398 crypto/mlx5: not in enabled drivers build config 00:01:37.398 crypto/mvsam: not in enabled drivers build config 00:01:37.398 crypto/nitrox: not in enabled drivers build config 00:01:37.398 crypto/null: not in enabled drivers build config 00:01:37.398 crypto/octeontx: not in enabled drivers build config 00:01:37.398 crypto/openssl: not in enabled drivers build config 00:01:37.398 crypto/scheduler: not in enabled drivers build config 00:01:37.398 crypto/uadk: not in enabled drivers build config 00:01:37.398 crypto/virtio: not in enabled drivers build config 00:01:37.398 compress/isal: not in enabled drivers build config 00:01:37.398 compress/mlx5: not in enabled drivers build config 00:01:37.398 compress/octeontx: not in enabled drivers build config 00:01:37.398 compress/zlib: not in enabled drivers build config 00:01:37.398 regex/mlx5: not in enabled drivers build config 00:01:37.398 regex/cn9k: not in enabled drivers build config 00:01:37.398 vdpa/ifc: not in enabled drivers build config 00:01:37.398 vdpa/mlx5: not in enabled drivers build config 00:01:37.398 vdpa/sfc: not in enabled drivers build config 00:01:37.398 event/cnxk: not in enabled drivers build config 00:01:37.398 event/dlb2: not in enabled drivers build config 00:01:37.398 event/dpaa: not in enabled drivers build config 00:01:37.398 event/dpaa2: not in enabled drivers build config 00:01:37.398 event/dsw: not in enabled drivers build config 00:01:37.398 event/opdl: not in enabled drivers build config 00:01:37.398 event/skeleton: not in enabled drivers build config 00:01:37.398 event/sw: not in enabled drivers build config 00:01:37.398 event/octeontx: not in enabled drivers build config 00:01:37.398 baseband/acc: not in enabled drivers build config 00:01:37.398 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:37.398 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:37.398 baseband/la12xx: not in enabled drivers build config 00:01:37.398 baseband/null: not in enabled drivers build config 00:01:37.398 baseband/turbo_sw: not in enabled drivers build config 00:01:37.398 gpu/cuda: not in enabled drivers build config 00:01:37.398 00:01:37.398 00:01:37.398 Build targets in project: 311 00:01:37.398 00:01:37.398 DPDK 22.11.4 00:01:37.398 00:01:37.398 User defined options 00:01:37.398 libdir : lib 00:01:37.398 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:37.398 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:37.398 c_link_args : 00:01:37.398 enable_docs : false 00:01:37.398 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:37.398 enable_kmods : false 00:01:37.398 machine : native 00:01:37.398 tests : false 00:01:37.398 00:01:37.398 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:37.398 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
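Configure ends here: 311 build targets, i40e as the only net driver, docs/kmods/tests off. What remains for the DPDK half of the job is compiling build-tmp and installing it under the `--prefix`, which is exactly the `dpdk/build` directory that SPDK's `--with-dpdk` flag (visible in the config_params captured earlier) points at. A sketch of that tail end: the ninja line matches the run that follows, while the `meson install` step and the `DPDK` variable are my assumptions about how the prefix gets populated, not something this excerpt shows.

    DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    ninja -C "$DPDK/build-tmp" -j112        # the job pins -j112 for this host
    meson install -C "$DPDK/build-tmp"      # assumed step: populate --prefix ($DPDK/build)
    ls "$DPDK/build/lib" | head             # sanity check: librte_* libraries land here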
00:01:37.663 07:26:48 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:37.663 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:37.663 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:37.663 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:37.663 [3/740] Generating lib/rte_telemetry_def with a custom command 00:01:37.663 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:37.663 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:37.663 [6/740] Generating lib/rte_eal_mingw with a custom command 00:01:37.663 [7/740] Generating lib/rte_rcu_mingw with a custom command 00:01:37.663 [8/740] Generating lib/rte_eal_def with a custom command 00:01:37.663 [9/740] Generating lib/rte_ring_def with a custom command 00:01:37.663 [10/740] Generating lib/rte_rcu_def with a custom command 00:01:37.663 [11/740] Generating lib/rte_ring_mingw with a custom command 00:01:37.663 [12/740] Generating lib/rte_mempool_def with a custom command 00:01:37.663 [13/740] Generating lib/rte_mempool_mingw with a custom command 00:01:37.927 [14/740] Generating lib/rte_mbuf_def with a custom command 00:01:37.927 [15/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:37.927 [16/740] Generating lib/rte_net_mingw with a custom command 00:01:37.927 [17/740] Generating lib/rte_net_def with a custom command 00:01:37.927 [18/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:37.927 [19/740] Generating lib/rte_meter_def with a custom command 00:01:37.927 [20/740] Generating lib/rte_meter_mingw with a custom command 00:01:37.927 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:37.927 [22/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:37.927 [23/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:37.927 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:37.927 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:37.927 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:37.927 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:37.927 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:37.927 [29/740] Generating lib/rte_ethdev_def with a custom command 00:01:37.927 [30/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:37.927 [31/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:37.927 [32/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:37.927 [33/740] Generating lib/rte_pci_def with a custom command 00:01:37.927 [34/740] Generating lib/rte_pci_mingw with a custom command 00:01:37.927 [35/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:37.927 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:37.927 [37/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:37.927 [38/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:37.927 [39/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:37.927 [40/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:37.927 [41/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 
00:01:37.927 [42/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:37.927 [43/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:37.927 [44/740] Linking static target lib/librte_kvargs.a 00:01:37.927 [45/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:37.927 [46/740] Generating lib/rte_cmdline_def with a custom command 00:01:37.927 [47/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:37.927 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:37.927 [49/740] Generating lib/rte_metrics_mingw with a custom command 00:01:37.927 [50/740] Generating lib/rte_metrics_def with a custom command 00:01:37.927 [51/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:37.927 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:37.927 [53/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:37.927 [54/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:37.927 [55/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:37.927 [56/740] Generating lib/rte_hash_def with a custom command 00:01:37.927 [57/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:37.927 [58/740] Generating lib/rte_hash_mingw with a custom command 00:01:37.927 [59/740] Generating lib/rte_timer_mingw with a custom command 00:01:37.927 [60/740] Generating lib/rte_timer_def with a custom command 00:01:37.927 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:37.927 [62/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:37.927 [63/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:37.927 [64/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:37.927 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:37.927 [66/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:37.927 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:37.927 [68/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:37.927 [69/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:37.927 [70/740] Generating lib/rte_bbdev_def with a custom command 00:01:37.927 [71/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:37.927 [72/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:37.927 [73/740] Generating lib/rte_acl_mingw with a custom command 00:01:37.927 [74/740] Generating lib/rte_bitratestats_def with a custom command 00:01:37.927 [75/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:37.927 [76/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:37.927 [77/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:37.927 [78/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:37.927 [79/740] Generating lib/rte_acl_def with a custom command 00:01:37.927 [80/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:37.928 [81/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:37.928 [82/740] Linking static target lib/librte_pci.a 00:01:37.928 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:37.928 [84/740] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:37.928 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:37.928 [86/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:38.188 [87/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:38.188 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:38.188 [89/740] Generating lib/rte_bpf_def with a custom command 00:01:38.188 [90/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:38.188 [91/740] Generating lib/rte_bpf_mingw with a custom command 00:01:38.188 [92/740] Generating lib/rte_cfgfile_def with a custom command 00:01:38.188 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:38.188 [94/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:38.188 [95/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:38.188 [96/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:38.188 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:38.188 [98/740] Generating lib/rte_compressdev_def with a custom command 00:01:38.188 [99/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:38.188 [100/740] Linking static target lib/librte_meter.a 00:01:38.188 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:38.188 [102/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:38.188 [103/740] Linking static target lib/librte_ring.a 00:01:38.188 [104/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:38.188 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:38.188 [106/740] Generating lib/rte_cryptodev_def with a custom command 00:01:38.188 [107/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:38.188 [108/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:38.188 [109/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:38.188 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:38.188 [111/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:38.189 [112/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:38.189 [113/740] Generating lib/rte_distributor_mingw with a custom command 00:01:38.189 [114/740] Generating lib/rte_efd_mingw with a custom command 00:01:38.189 [115/740] Generating lib/rte_efd_def with a custom command 00:01:38.189 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:38.189 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:38.189 [118/740] Generating lib/rte_distributor_def with a custom command 00:01:38.189 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:38.189 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:38.189 [121/740] Generating lib/rte_eventdev_def with a custom command 00:01:38.189 [122/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:38.189 [123/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:38.189 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:38.189 [125/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:38.189 [126/740] Generating 
lib/rte_gpudev_def with a custom command 00:01:38.189 [127/740] Generating lib/rte_gro_def with a custom command 00:01:38.189 [128/740] Generating lib/rte_gro_mingw with a custom command 00:01:38.189 [129/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:38.189 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:38.189 [131/740] Generating lib/rte_gso_def with a custom command 00:01:38.189 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:38.189 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:38.454 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:38.454 [135/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.454 [136/740] Generating lib/rte_ip_frag_def with a custom command 00:01:38.454 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:38.454 [138/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:38.454 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.454 [140/740] Linking target lib/librte_kvargs.so.23.0 00:01:38.454 [141/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:38.454 [142/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:38.454 [143/740] Generating lib/rte_jobstats_def with a custom command 00:01:38.454 [144/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:38.454 [145/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.454 [146/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:38.454 [147/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:38.454 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:38.454 [149/740] Linking static target lib/librte_cfgfile.a 00:01:38.454 [150/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:38.454 [151/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:38.454 [152/740] Generating lib/rte_latencystats_def with a custom command 00:01:38.454 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:38.454 [154/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:38.454 [155/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:38.454 [156/740] Generating lib/rte_lpm_def with a custom command 00:01:38.454 [157/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:38.454 [158/740] Generating lib/rte_lpm_mingw with a custom command 00:01:38.454 [159/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:38.454 [160/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:38.454 [161/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.454 [162/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:38.454 [163/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:38.454 [164/740] Generating lib/rte_member_def with a custom command 00:01:38.454 [165/740] Generating lib/rte_member_mingw with a custom command 00:01:38.454 [166/740] Generating lib/rte_pcapng_def with a custom command 00:01:38.454 [167/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:38.454 
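Two recurring generated targets in this stretch of the log are worth noting. The lib/<name>.sym_chk commands ("wrapped by meson to capture output") run DPDK's export check, comparing the symbols a library actually exports against its version map, while the "Generating symbol file ... .symbols" steps are meson's relink-avoidance mechanism, recording each shared object's exports so dependents are only relinked when the exported interface changes. Purely as an illustration (not part of this job), the same exports can be inspected by hand from the build directory:

  $ nm -D --defined-only lib/librte_kvargs.so.23.0 | head   # dump a few dynamic symbols of the freshly linked DSO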
[168/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:38.454 [169/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:38.454 [170/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:38.454 [171/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:38.454 [172/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:38.454 [173/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:38.454 [174/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:38.454 [175/740] Linking static target lib/librte_jobstats.a 00:01:38.454 [176/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:38.713 [177/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:38.713 [178/740] Linking static target lib/librte_cmdline.a 00:01:38.714 [179/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:38.714 [180/740] Generating lib/rte_power_mingw with a custom command 00:01:38.714 [181/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:38.714 [182/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:38.714 [183/740] Generating lib/rte_power_def with a custom command 00:01:38.714 [184/740] Linking static target lib/librte_timer.a 00:01:38.714 [185/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:38.714 [186/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:38.714 [187/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:38.714 [188/740] Generating lib/rte_rawdev_def with a custom command 00:01:38.714 [189/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:38.714 [190/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:38.714 [191/740] Linking static target lib/librte_metrics.a 00:01:38.714 [192/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:38.714 [193/740] Linking static target lib/librte_telemetry.a 00:01:38.714 [194/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:38.714 [195/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:38.714 [196/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:38.714 [197/740] Generating lib/rte_regexdev_def with a custom command 00:01:38.714 [198/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:38.714 [199/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:38.714 [200/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:38.714 [201/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:38.714 [202/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:38.714 [203/740] Generating lib/rte_dmadev_def with a custom command 00:01:38.714 [204/740] Generating lib/rte_rib_def with a custom command 00:01:38.714 [205/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:38.714 [206/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:38.714 [207/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:38.714 [208/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:38.714 [209/740] Generating lib/rte_rib_mingw with a custom command 00:01:38.714 [210/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:38.714 [211/740] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:38.714 [212/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:38.714 [213/740] Generating lib/rte_reorder_def with a custom command 00:01:38.714 [214/740] Generating lib/rte_reorder_mingw with a custom command 00:01:38.714 [215/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:38.714 [216/740] Generating lib/rte_sched_def with a custom command 00:01:38.714 [217/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:38.714 [218/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:38.714 [219/740] Generating lib/rte_sched_mingw with a custom command 00:01:38.714 [220/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:38.714 [221/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:38.714 [222/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:38.714 [223/740] Generating lib/rte_security_def with a custom command 00:01:38.714 [224/740] Generating lib/rte_security_mingw with a custom command 00:01:38.714 [225/740] Generating lib/rte_stack_def with a custom command 00:01:38.714 [226/740] Linking static target lib/librte_bitratestats.a 00:01:38.714 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:38.714 [228/740] Linking static target lib/librte_net.a 00:01:38.714 [229/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:38.714 [230/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:38.714 [231/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:38.714 [232/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:38.714 [233/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:38.714 [234/740] Generating lib/rte_vhost_def with a custom command 00:01:38.714 [235/740] Generating lib/rte_vhost_mingw with a custom command 00:01:38.714 [236/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:38.714 [237/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:38.714 [238/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:38.714 [239/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:38.714 [240/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:38.714 [241/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:38.714 [242/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:38.714 [243/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:38.714 [244/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:38.714 [245/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:38.981 [246/740] Generating lib/rte_ipsec_def with a custom command 00:01:38.981 [247/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:38.981 [248/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:38.981 [249/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:38.981 [250/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:38.981 [251/740] Generating lib/rte_fib_def with a custom command 00:01:38.981 [252/740] Generating lib/rte_fib_mingw with a custom command 00:01:38.981 [253/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:38.981 [254/740] Compiling C object 
lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:38.981 [255/740] Linking static target lib/librte_stack.a 00:01:38.981 [256/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:38.981 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:38.981 [258/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:38.981 [259/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:38.981 [260/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:38.981 [261/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:38.981 [262/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:38.981 [263/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:38.981 [264/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:38.981 [265/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:38.981 [266/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:38.981 [267/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:38.981 [268/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.981 [269/740] Generating lib/rte_port_def with a custom command 00:01:38.981 [270/740] Generating lib/rte_port_mingw with a custom command 00:01:38.981 [271/740] Generating lib/rte_pdump_mingw with a custom command 00:01:38.981 [272/740] Linking static target lib/librte_compressdev.a 00:01:38.981 [273/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:38.981 [274/740] Generating lib/rte_pdump_def with a custom command 00:01:38.981 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:38.981 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:38.981 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:38.981 [278/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.981 [279/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:38.981 [280/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:38.981 [281/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:38.981 [282/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:38.981 [283/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:38.981 [284/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.981 [285/740] Linking static target lib/librte_rawdev.a 00:01:38.981 [286/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:38.981 [287/740] Linking static target lib/librte_rcu.a 00:01:38.981 [288/740] Linking static target lib/librte_mempool.a 00:01:39.243 [289/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:39.243 [290/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:39.243 [291/740] Generating lib/rte_table_mingw with a custom command 00:01:39.243 [292/740] Generating lib/rte_table_def with a custom command 00:01:39.243 [293/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:39.243 [294/740] Linking static target lib/librte_bbdev.a 00:01:39.243 [295/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:39.243 [296/740] Compiling C object 
lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:39.243 [297/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:39.243 [298/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.243 [299/740] Linking static target lib/librte_dmadev.a 00:01:39.243 [300/740] Linking static target lib/librte_gpudev.a 00:01:39.243 [301/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:39.243 [302/740] Linking static target lib/librte_gro.a 00:01:39.243 [303/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:39.243 [304/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:39.243 [305/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:39.243 [306/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.243 [307/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.243 [308/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.243 [309/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:39.243 [310/740] Generating lib/rte_pipeline_def with a custom command 00:01:39.243 [311/740] Linking static target lib/librte_gso.a 00:01:39.243 [312/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:39.243 [313/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:39.243 [314/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:39.243 [315/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.243 [316/740] Linking static target lib/librte_latencystats.a 00:01:39.243 [317/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:39.243 [318/740] Linking target lib/librte_telemetry.so.23.0 00:01:39.243 [319/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:39.243 [320/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:39.243 [321/740] Generating lib/rte_graph_def with a custom command 00:01:39.243 [322/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:39.243 [323/740] Generating lib/rte_graph_mingw with a custom command 00:01:39.243 [324/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:39.243 [325/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:39.243 [326/740] Linking static target lib/librte_distributor.a 00:01:39.243 [327/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:39.243 [328/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:39.512 [329/740] Linking static target lib/librte_ip_frag.a 00:01:39.512 [330/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:39.512 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:39.512 [332/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:39.512 [333/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:39.512 [334/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:39.512 [335/740] Linking static target lib/librte_regexdev.a 00:01:39.512 [336/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:39.512 [337/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 
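The *_avx512 objects above (lib/net/libnet_crc_avx512_lib.a, lib/member/libsketch_avx512_tmp.a) are compiled into small helper archives separate from their parent libraries, so that only those files receive AVX-512 compiler flags while the rest of the build keeps the baseline ISA; the library can then select the AVX-512 code paths at run time on CPUs that support them. Roughly, and with representative flags not taken from this log:

  $ cc -O3 -mavx512f -mavx512bw -mavx512dq -c rte_member_sketch_avx512.c   # hypothetical: ISA-specific object, archived separately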
00:01:39.512 [338/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:39.512 [339/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:39.512 [340/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:39.512 [341/740] Generating lib/rte_node_def with a custom command 00:01:39.512 [342/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:39.512 [343/740] Generating lib/rte_node_mingw with a custom command 00:01:39.512 [344/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:39.512 [345/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:39.512 [346/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.512 [347/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:39.512 [348/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:39.513 [349/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:39.513 [350/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:39.513 [351/740] Linking static target lib/librte_eal.a 00:01:39.513 [352/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.513 [353/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:39.513 [354/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:39.513 [355/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:39.513 [356/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:39.513 [357/740] Linking static target lib/librte_reorder.a 00:01:39.513 [358/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:39.513 [359/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:39.513 [360/740] Linking static target lib/librte_power.a 00:01:39.513 [361/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:39.513 [362/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:39.513 [363/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.513 [364/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:39.513 [365/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:39.513 [366/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.513 [367/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:39.513 [368/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:39.773 [369/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:39.773 [370/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:39.773 [371/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:39.773 [372/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:39.773 [373/740] Linking static target lib/librte_pcapng.a 00:01:39.773 [374/740] Linking static target lib/librte_security.a 00:01:39.773 [375/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:39.773 [376/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:39.773 [377/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:39.773 [378/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 
00:01:39.773 [379/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:39.773 [380/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:39.773 [381/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:39.773 [382/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:39.773 [383/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.773 [384/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.773 [385/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:39.773 [386/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:39.773 [387/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:39.773 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:39.773 [389/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:39.773 [390/740] Linking static target lib/librte_mbuf.a 00:01:39.773 [391/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.773 [392/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:39.773 [393/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:39.773 [394/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:39.773 [395/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:39.773 [396/740] Linking static target lib/librte_bpf.a 00:01:39.773 [397/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:39.773 [398/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:40.037 [399/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:40.037 [400/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:40.037 [401/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:40.037 [402/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:40.037 [403/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:40.037 [404/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:40.037 [405/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:40.037 [406/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:40.037 [407/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:40.037 [408/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:40.037 [409/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:40.037 [410/740] Linking static target lib/librte_lpm.a 00:01:40.037 [411/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.037 [412/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:40.037 [413/740] Linking static target lib/librte_rib.a 00:01:40.037 [414/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:40.037 [415/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:40.037 [416/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:40.037 [417/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:40.037 [418/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:40.037 [419/740] Compiling C object 
lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:40.037 [420/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:40.037 [421/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:40.037 [422/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.037 [423/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:40.037 [424/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:40.037 [425/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:40.037 [426/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.037 [427/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:40.037 [428/740] Linking static target lib/librte_graph.a 00:01:40.037 [429/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:40.037 [430/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.037 [431/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:40.037 [432/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:40.037 [433/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:40.037 [434/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:40.037 [435/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:40.037 [436/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:40.298 [437/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:40.298 [438/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:40.298 [439/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.298 [440/740] Linking static target lib/librte_efd.a 00:01:40.298 [441/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:40.298 [442/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:40.298 [443/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:40.298 [444/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:40.298 [445/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:40.298 [446/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:40.298 [447/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:40.298 [448/740] Linking static target drivers/librte_bus_vdev.a 00:01:40.298 [449/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.298 [450/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:40.298 [451/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:40.298 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:40.298 [453/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:40.298 [454/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:40.298 [455/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:40.298 [456/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:40.559 [457/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [458/740] Compiling C object 
lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:40.559 [459/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [460/740] Linking static target lib/librte_fib.a 00:01:40.559 [461/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:40.559 [463/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:40.559 [464/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:40.559 [465/740] Linking static target lib/librte_pdump.a 00:01:40.559 [466/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:40.559 [467/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [468/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [469/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.559 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:40.559 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:40.559 [472/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.822 [473/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:40.822 [474/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.822 [475/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:40.822 [476/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:40.822 [477/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:40.822 [478/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.822 [479/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:40.822 [480/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:40.822 [481/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:40.822 [482/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.822 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:40.822 [484/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:40.822 [485/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:40.822 [486/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:40.822 [487/740] Linking static target drivers/librte_bus_pci.a 00:01:40.822 [488/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:40.822 [489/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:40.822 [490/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:40.822 [491/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:40.822 [492/740] Linking static target lib/librte_table.a 00:01:40.822 [493/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:40.822 [494/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:40.822 [495/740] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:40.822 [496/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:40.822 [497/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:41.081 [498/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:41.081 [499/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:41.081 [500/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:41.081 [501/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:41.081 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:41.081 [503/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.081 [504/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:41.081 [505/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:41.081 [506/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.081 [507/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:41.081 [508/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:41.081 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:41.081 [510/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:41.081 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:41.081 [512/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:41.081 [513/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:41.081 [514/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:41.081 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:41.081 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:41.081 [517/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.081 [518/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:41.081 [519/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:41.081 [520/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:41.081 [521/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:41.081 [522/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:41.081 [523/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:41.081 [524/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:41.081 [525/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:41.081 [526/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:41.340 [527/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:41.340 [528/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:41.340 [529/740] Linking static target lib/librte_sched.a 00:01:41.340 [530/740] Linking static target lib/librte_cryptodev.a 00:01:41.340 [531/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:41.340 [532/740] Compiling C object 
app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:41.341 [533/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:41.341 [534/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:41.341 [535/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:41.341 [536/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.341 [537/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:41.341 [538/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:41.341 [539/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:41.341 [540/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:41.341 [541/740] Linking static target lib/librte_node.a 00:01:41.341 [542/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.341 [543/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:41.341 [544/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:41.341 [545/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:41.341 [546/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:41.341 [547/740] Linking static target lib/librte_ipsec.a 00:01:41.341 [548/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:41.341 [549/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:41.341 [550/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:41.341 [551/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:41.341 [552/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:41.341 [553/740] Linking static target drivers/librte_mempool_ring.a 00:01:41.341 [554/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:41.599 [555/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:41.599 [556/740] Linking static target lib/librte_ethdev.a 00:01:41.599 [557/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:41.599 [558/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:41.599 [559/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:41.599 [560/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:41.599 [561/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:41.599 [562/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:41.599 [563/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:41.599 [564/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:41.599 [565/740] Linking static target lib/librte_port.a 00:01:41.599 [566/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:41.599 [567/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:41.599 [568/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:41.599 [569/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:41.599 [570/740] Compiling C object 
app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:41.599 [571/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:41.599 [572/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:41.599 [573/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:41.599 [574/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.599 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:41.599 [576/740] Linking static target lib/librte_eventdev.a 00:01:41.599 [577/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:41.599 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:41.599 [579/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:41.599 [580/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:41.599 [581/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:41.599 [582/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:41.599 [583/740] Linking static target lib/librte_member.a 00:01:41.599 [584/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:41.599 [585/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:41.859 [586/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:41.859 [587/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:41.859 [588/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:41.859 [589/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:41.859 [590/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:41.859 [591/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:41.859 [592/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.859 [593/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.859 [594/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:41.859 [595/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.859 [596/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:41.859 [597/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:41.859 [598/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:42.119 [599/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:42.119 [600/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:42.119 [601/740] Linking static target lib/librte_hash.a 00:01:42.119 [602/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:42.119 [603/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:42.119 [604/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:42.119 [605/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:42.119 [606/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:42.119 [607/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:42.119 [608/740] Linking static target lib/librte_acl.a 00:01:42.119 [609/740] Generating lib/member.sym_chk 
with a custom command (wrapped by meson to capture output) 00:01:42.119 [610/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:42.119 [611/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:42.377 [612/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.377 [613/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:42.637 [614/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.637 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:42.637 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:42.897 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:43.156 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.156 [619/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:43.414 [620/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:43.673 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:43.932 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:43.932 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:44.190 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:44.190 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:44.190 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:44.451 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:45.017 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:45.017 [629/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.017 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.017 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:45.285 [632/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:45.285 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.544 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.112 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:51.112 [636/740] Linking static target lib/librte_vhost.a 00:01:51.678 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:51.678 [638/740] Linking static target lib/librte_pipeline.a 00:01:51.937 [639/740] Linking target app/dpdk-test-acl 00:01:51.937 [640/740] Linking target app/dpdk-test-cmdline 00:01:51.937 [641/740] Linking target app/dpdk-test-crypto-perf 00:01:51.937 [642/740] Linking target app/dpdk-test-security-perf 00:01:51.937 [643/740] Linking target app/dpdk-test-gpudev 00:01:51.937 [644/740] Linking target app/dpdk-proc-info 00:01:51.937 [645/740] Linking target app/dpdk-test-bbdev 00:01:51.937 [646/740] Linking target app/dpdk-test-eventdev 00:01:51.937 [647/740] Linking target app/dpdk-test-regex 00:01:51.937 [648/740] Linking target app/dpdk-test-sad 00:01:51.937 [649/740] Linking target app/dpdk-dumpcap 00:01:51.937 [650/740] Linking target app/dpdk-pdump 00:01:51.937 [651/740] Linking target app/dpdk-test-fib 
00:01:51.937 [652/740] Linking target app/dpdk-test-compress-perf 00:01:51.937 [653/740] Linking target app/dpdk-test-pipeline 00:01:51.937 [654/740] Linking target app/dpdk-test-flow-perf 00:01:51.937 [655/740] Linking target app/dpdk-testpmd 00:01:53.314 [656/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [657/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [658/740] Linking target lib/librte_eal.so.23.0 00:01:53.572 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:53.572 [660/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:53.572 [661/740] Linking target lib/librte_meter.so.23.0 00:01:53.572 [662/740] Linking target lib/librte_timer.so.23.0 00:01:53.572 [663/740] Linking target lib/librte_ring.so.23.0 00:01:53.572 [664/740] Linking target lib/librte_jobstats.so.23.0 00:01:53.572 [665/740] Linking target lib/librte_pci.so.23.0 00:01:53.572 [666/740] Linking target lib/librte_cfgfile.so.23.0 00:01:53.572 [667/740] Linking target lib/librte_dmadev.so.23.0 00:01:53.572 [668/740] Linking target lib/librte_rawdev.so.23.0 00:01:53.572 [669/740] Linking target lib/librte_stack.so.23.0 00:01:53.572 [670/740] Linking target lib/librte_graph.so.23.0 00:01:53.572 [671/740] Linking target lib/librte_acl.so.23.0 00:01:53.572 [672/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:53.572 [673/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:53.572 [674/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:53.572 [675/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:53.830 [676/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:53.830 [677/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:53.830 [678/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:53.830 [679/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:53.830 [680/740] Linking target lib/librte_mempool.so.23.0 00:01:53.830 [681/740] Linking target lib/librte_rcu.so.23.0 00:01:53.830 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:53.830 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:53.830 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:53.830 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:53.830 [686/740] Linking target lib/librte_mbuf.so.23.0 00:01:53.830 [687/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:53.830 [688/740] Linking target lib/librte_rib.so.23.0 00:01:54.088 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:54.088 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:54.088 [691/740] Linking target lib/librte_fib.so.23.0 00:01:54.088 [692/740] Linking target lib/librte_sched.so.23.0 00:01:54.088 [693/740] Linking target lib/librte_bbdev.so.23.0 00:01:54.088 [694/740] Linking target lib/librte_gpudev.so.23.0 00:01:54.088 [695/740] Linking target lib/librte_reorder.so.23.0 00:01:54.088 [696/740] Linking target lib/librte_compressdev.so.23.0 00:01:54.088 [697/740] Linking target 
lib/librte_net.so.23.0 00:01:54.088 [698/740] Linking target lib/librte_distributor.so.23.0 00:01:54.088 [699/740] Linking target lib/librte_regexdev.so.23.0 00:01:54.088 [700/740] Linking target lib/librte_cryptodev.so.23.0 00:01:54.347 [701/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:01:54.347 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:01:54.347 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:01:54.347 [704/740] Linking target lib/librte_cmdline.so.23.0 00:01:54.347 [705/740] Linking target lib/librte_hash.so.23.0 00:01:54.347 [706/740] Linking target lib/librte_security.so.23.0 00:01:54.347 [707/740] Linking target lib/librte_ethdev.so.23.0 00:01:54.347 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:01:54.347 [709/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:01:54.347 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:01:54.605 [711/740] Linking target lib/librte_efd.so.23.0 00:01:54.605 [712/740] Linking target lib/librte_lpm.so.23.0 00:01:54.605 [713/740] Linking target lib/librte_member.so.23.0 00:01:54.605 [714/740] Linking target lib/librte_ipsec.so.23.0 00:01:54.605 [715/740] Linking target lib/librte_metrics.so.23.0 00:01:54.605 [716/740] Linking target lib/librte_ip_frag.so.23.0 00:01:54.605 [717/740] Linking target lib/librte_gro.so.23.0 00:01:54.605 [718/740] Linking target lib/librte_bpf.so.23.0 00:01:54.605 [719/740] Linking target lib/librte_gso.so.23.0 00:01:54.605 [720/740] Linking target lib/librte_pcapng.so.23.0 00:01:54.606 [721/740] Linking target lib/librte_power.so.23.0 00:01:54.606 [722/740] Linking target lib/librte_eventdev.so.23.0 00:01:54.606 [723/740] Linking target lib/librte_vhost.so.23.0 00:01:54.606 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:01:54.606 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:01:54.606 [726/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:01:54.606 [727/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:01:54.606 [728/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:01:54.606 [729/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:01:54.606 [730/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:01:54.606 [731/740] Linking target lib/librte_node.so.23.0 00:01:54.606 [732/740] Linking target lib/librte_bitratestats.so.23.0 00:01:54.606 [733/740] Linking target lib/librte_latencystats.so.23.0 00:01:54.606 [734/740] Linking target lib/librte_port.so.23.0 00:01:54.606 [735/740] Linking target lib/librte_pdump.so.23.0 00:01:54.865 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:01:54.865 [737/740] Linking target lib/librte_table.so.23.0 00:01:55.124 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:01:56.503 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.503 [740/740] Linking target lib/librte_pipeline.so.23.0 00:01:56.503 07:27:07 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:01:56.503 ninja: 
Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:56.503 [0/1] Installing files. 00:01:56.769 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.769 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 
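
The `ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install` invocation earlier in this log is what drives this entire "Installing ..." listing: meson's install step copies the examples/ tree into the build prefix under share/dpdk/examples. A minimal sketch of the equivalent manual sequence follows; the directory name and -j value are taken from the log, while the meson configure options the CI passed are not visible in this excerpt and are deliberately elided:

    # Sketch of the DPDK build/install flow this log reflects (not the autobuild script itself).
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup build-tmp              # configure step; options used by the CI are not shown in this excerpt
    ninja -C build-tmp -j112           # compile and link, ending with targets like librte_pipeline.so.23.0
    ninja -C build-tmp -j112 install   # re-enters build-tmp and emits the "Installing ..." lines seen here
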
00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.770 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.771 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:56.771 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:56.771 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.772 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:01:56.773 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
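
Each example staged in this listing keeps its Makefile, so the copies under build/share/dpdk/examples can be rebuilt out of tree against the installed DPDK via pkg-config. A minimal sketch, assuming this prefix's libdpdk.pc ends up on PKG_CONFIG_PATH; the exact lib/ subdirectory holding it varies with the platform's library layout, so the path below is an assumption rather than something shown in this log:

    # Rebuild one staged example (ip_pipeline) against the just-installed DPDK.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig   # assumed location of libdpdk.pc
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
    make    # the example Makefiles resolve flags via: pkg-config --cflags --libs libdpdk
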
00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.773 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:56.774 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:56.774 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 
00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:56.774 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:56.775 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:56.775 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:56.775 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cmdline.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 
Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:56.775 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.040 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.040 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_vhost.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:57.041 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:57.041 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:57.041 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.041 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:57.041 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-cmdline to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.041 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.042 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.043 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.044 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:57.046 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:57.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:57.046 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:01:57.046 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:57.046 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:01:57.046 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:57.046 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:01:57.046 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:57.046 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:01:57.046 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:57.046 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:01:57.046 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:57.046 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:01:57.046 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:57.046 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:01:57.046 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:57.046 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:01:57.046 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:57.046 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:01:57.046 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:57.046 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:01:57.046 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:57.046 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:01:57.046 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:57.046 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:01:57.046 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:57.046 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:01:57.046 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:57.046 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:01:57.046 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:57.047 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:01:57.047 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:57.047 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:01:57.047 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:57.047 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:01:57.047 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:57.047 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:01:57.047 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:57.047 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:01:57.047 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:57.047 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:01:57.047 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:57.047 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:01:57.047 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:57.047 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:01:57.047 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:57.047 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:01:57.047 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:57.047 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:01:57.047 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:57.047 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:01:57.047 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:57.047 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:01:57.047 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:57.047 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:01:57.047 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:57.047 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:01:57.047 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:57.047 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:01:57.047 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:57.047 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:01:57.047 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:57.047 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:01:57.047 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:57.047 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:01:57.047 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:57.047 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:01:57.047 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:57.047 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:01:57.047 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:57.047 
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:01:57.047 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:57.047 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:01:57.047 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:57.047 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:01:57.047 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:57.047 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:01:57.047 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:57.047 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:01:57.047 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:57.047 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:01:57.047 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:57.047 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:01:57.047 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:57.047 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:01:57.047 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:57.047 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:01:57.306 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:01:57.306 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:01:57.306 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:01:57.306 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:01:57.306 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:01:57.306 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:01:57.306 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:01:57.306 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:01:57.306 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:01:57.306 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:01:57.306 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:01:57.306 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:01:57.306 Installing symlink pointing to 
librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:57.306 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:01:57.306 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:57.306 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:01:57.306 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:57.306 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:01:57.306 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:57.306 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:01:57.306 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:57.306 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:01:57.306 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:57.306 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:01:57.306 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:57.306 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:01:57.306 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:57.306 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:01:57.306 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:57.306 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:01:57.306 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:57.306 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:01:57.306 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:01:57.306 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:01:57.306 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:01:57.306 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:01:57.306 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:01:57.306 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:01:57.306 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:01:57.306 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:01:57.306 07:27:07 -- common/autobuild_common.sh@192 -- $ uname -s 00:01:57.306 07:27:07 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:57.306 07:27:07 -- common/autobuild_common.sh@203 -- $ cat 00:01:57.306 07:27:07 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:57.306 00:01:57.306 real 0m25.097s 00:01:57.306 user 6m35.037s 00:01:57.306 sys 2m12.353s 00:01:57.306 07:27:07 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:57.306 07:27:07 -- common/autotest_common.sh@10 -- $ set +x 00:01:57.306 ************************************ 00:01:57.306 END TEST build_native_dpdk 00:01:57.306 ************************************ 00:01:57.306 07:27:07 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:57.306 07:27:07 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:57.306 07:27:07 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:57.306 07:27:07 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:57.306 07:27:07 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:57.306 07:27:07 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:57.306 07:27:07 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:57.306 07:27:07 -- common/autotest_common.sh@10 -- $ set +x 00:01:57.306 ************************************ 00:01:57.306 START TEST autobuild_llvm_precompile 00:01:57.306 ************************************ 00:01:57.306 07:27:07 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:01:57.306 07:27:07 -- common/autobuild_common.sh@32 -- $ clang --version 00:01:57.306 07:27:07 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:01:57.306 Target: x86_64-redhat-linux-gnu 00:01:57.306 Thread model: posix 00:01:57.306 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:57.306 07:27:07 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:01:57.306 07:27:07 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:01:57.306 07:27:07 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:01:57.306 07:27:07 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:01:57.306 07:27:07 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:01:57.307 07:27:07 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:01:57.307 07:27:07 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:57.307 07:27:07 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:01:57.307 07:27:07 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:01:57.307 07:27:07 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:57.565 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:01:57.565 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:57.565 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:57.824 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:58.083 Using 'verbs' RDMA provider 00:02:13.907 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:26.283 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:26.283 Creating mk/config.mk...done. 00:02:26.283 Creating mk/cc.flags.mk...done. 00:02:26.283 Type 'make' to build. 00:02:26.283 00:02:26.283 real 0m29.040s 00:02:26.283 user 0m12.629s 00:02:26.283 sys 0m15.813s 00:02:26.284 07:27:36 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:26.284 07:27:36 -- common/autotest_common.sh@10 -- $ set +x 00:02:26.284 ************************************ 00:02:26.284 END TEST autobuild_llvm_precompile 00:02:26.284 ************************************ 00:02:26.284 07:27:36 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:26.284 07:27:36 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:26.284 07:27:36 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:26.284 07:27:36 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:26.284 07:27:36 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:26.543 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:26.801 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:26.801 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.801 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:27.369 Using 'verbs' RDMA provider 00:02:40.148 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:52.360 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:52.360 Creating mk/config.mk...done. 
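(Aside for readers following the build: a minimal sketch, with paths assumed from the install lines above, of how a consumer such as this configure step can resolve the freshly built DPDK through the libdpdk.pc files installed under dpdk/build/lib/pkgconfig. The pkg-config invocations are standard; the flag output noted in the comments is illustrative, not copied from this log.)

    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --modversion libdpdk   # prints the installed DPDK version
    pkg-config --cflags libdpdk       # include flags, e.g. -I.../dpdk/build/include
    pkg-config --libs libdpdk         # -lrte_* link flags for the libraries installed above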
00:02:52.360 Creating mk/cc.flags.mk...done. 00:02:52.360 Type 'make' to build. 00:02:52.360 07:28:01 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:52.360 07:28:01 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:02:52.360 07:28:01 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:02:52.360 07:28:01 -- common/autotest_common.sh@10 -- $ set +x 00:02:52.360 ************************************ 00:02:52.360 START TEST make 00:02:52.360 ************************************ 00:02:52.360 07:28:01 -- common/autotest_common.sh@1114 -- $ make -j112 00:02:52.360 make[1]: Nothing to be done for 'all'. 00:02:53.296 The Meson build system 00:02:53.296 Version: 1.5.0 00:02:53.296 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:53.296 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:53.296 Build type: native build 00:02:53.296 Project name: libvfio-user 00:02:53.296 Project version: 0.0.1 00:02:53.296 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:53.296 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:53.296 Host machine cpu family: x86_64 00:02:53.297 Host machine cpu: x86_64 00:02:53.297 Run-time dependency threads found: YES 00:02:53.297 Library dl found: YES 00:02:53.297 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:53.297 Run-time dependency json-c found: YES 0.17 00:02:53.297 Run-time dependency cmocka found: YES 1.1.7 00:02:53.297 Program pytest-3 found: NO 00:02:53.297 Program flake8 found: NO 00:02:53.297 Program misspell-fixer found: NO 00:02:53.297 Program restructuredtext-lint found: NO 00:02:53.297 Program valgrind found: YES (/usr/bin/valgrind) 00:02:53.297 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:53.297 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:53.297 Compiler for C supports arguments -Wwrite-strings: YES 00:02:53.297 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:53.297 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:53.297 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:53.297 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
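(A minimal sketch, assuming the source and build directories reported by Meson above, of the equivalent manual sequence for configuring, building, and staging libvfio-user; the job itself drives this through its own scripts, and the option values mirror the "User defined options" summary that follows.)

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
    meson setup --buildtype=debug -Ddefault_library=static \
        ../build/libvfio-user/build-debug
    ninja -C ../build/libvfio-user/build-debug
    DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user \
        meson install --quiet -C ../build/libvfio-user/build-debug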
00:02:53.297 Build targets in project: 8 00:02:53.297 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:53.297 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:53.297 00:02:53.297 libvfio-user 0.0.1 00:02:53.297 00:02:53.297 User defined options 00:02:53.297 buildtype : debug 00:02:53.297 default_library: static 00:02:53.297 libdir : /usr/local/lib 00:02:53.297 00:02:53.297 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:53.555 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:53.555 [1/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:53.555 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:53.555 [3/36] Compiling C object samples/null.p/null.c.o 00:02:53.555 [4/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:53.556 [5/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:53.556 [6/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:53.556 [7/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:53.556 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:53.556 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:53.556 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:53.556 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:53.556 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:53.556 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:53.556 [14/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:53.556 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:53.556 [16/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:53.556 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:53.556 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:53.556 [19/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:53.556 [20/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:53.556 [21/36] Compiling C object samples/server.p/server.c.o 00:02:53.556 [22/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:53.556 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:53.556 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:53.556 [25/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:53.556 [26/36] Compiling C object samples/client.p/client.c.o 00:02:53.556 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:53.556 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:53.556 [29/36] Linking target samples/client 00:02:53.556 [30/36] Linking static target lib/libvfio-user.a 00:02:53.556 [31/36] Linking target samples/gpio-pci-idio-16 00:02:53.556 [32/36] Linking target test/unit_tests 00:02:53.556 [33/36] Linking target samples/lspci 00:02:53.556 [34/36] Linking target samples/server 00:02:53.815 [35/36] Linking target samples/shadow_ioeventfd_server 00:02:53.815 [36/36] Linking target samples/null 00:02:53.815 INFO: autodetecting backend as ninja 00:02:53.815 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:53.815 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:54.074 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:54.074 ninja: no work to do. 00:02:57.364 CC lib/ut/ut.o 00:02:57.364 CC lib/ut_mock/mock.o 00:02:57.364 CC lib/log/log.o 00:02:57.364 CC lib/log/log_flags.o 00:02:57.364 CC lib/log/log_deprecated.o 00:02:57.364 LIB libspdk_ut.a 00:02:57.364 LIB libspdk_ut_mock.a 00:02:57.364 LIB libspdk_log.a 00:02:57.364 CXX lib/trace_parser/trace.o 00:02:57.364 CC lib/dma/dma.o 00:02:57.364 CC lib/ioat/ioat.o 00:02:57.364 CC lib/util/cpuset.o 00:02:57.364 CC lib/util/base64.o 00:02:57.364 CC lib/util/bit_array.o 00:02:57.364 CC lib/util/crc16.o 00:02:57.364 CC lib/util/crc32.o 00:02:57.364 CC lib/util/crc32c.o 00:02:57.364 CC lib/util/crc32_ieee.o 00:02:57.364 CC lib/util/crc64.o 00:02:57.364 CC lib/util/dif.o 00:02:57.364 CC lib/util/file.o 00:02:57.364 CC lib/util/fd.o 00:02:57.364 CC lib/util/hexlify.o 00:02:57.365 CC lib/util/iov.o 00:02:57.365 CC lib/util/math.o 00:02:57.365 CC lib/util/pipe.o 00:02:57.365 CC lib/util/strerror_tls.o 00:02:57.365 CC lib/util/string.o 00:02:57.365 CC lib/util/uuid.o 00:02:57.365 CC lib/util/fd_group.o 00:02:57.365 CC lib/util/xor.o 00:02:57.365 CC lib/util/zipf.o 00:02:57.365 CC lib/vfio_user/host/vfio_user.o 00:02:57.365 CC lib/vfio_user/host/vfio_user_pci.o 00:02:57.365 LIB libspdk_dma.a 00:02:57.624 LIB libspdk_ioat.a 00:02:57.624 LIB libspdk_vfio_user.a 00:02:57.624 LIB libspdk_util.a 00:02:57.882 LIB libspdk_trace_parser.a 00:02:57.882 CC lib/vmd/vmd.o 00:02:57.882 CC lib/vmd/led.o 00:02:57.882 CC lib/env_dpdk/env.o 00:02:57.882 CC lib/env_dpdk/memory.o 00:02:57.882 CC lib/env_dpdk/threads.o 00:02:57.882 CC lib/env_dpdk/pci.o 00:02:57.882 CC lib/env_dpdk/init.o 00:02:57.882 CC lib/env_dpdk/pci_virtio.o 00:02:57.882 CC lib/env_dpdk/pci_idxd.o 00:02:57.882 CC lib/env_dpdk/pci_ioat.o 00:02:57.882 CC lib/conf/conf.o 00:02:57.882 CC lib/env_dpdk/pci_vmd.o 00:02:57.882 CC lib/env_dpdk/pci_event.o 00:02:57.882 CC lib/env_dpdk/sigbus_handler.o 00:02:57.882 CC lib/env_dpdk/pci_dpdk.o 00:02:57.882 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:57.882 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:57.882 CC lib/json/json_util.o 00:02:57.882 CC lib/json/json_parse.o 00:02:57.882 CC lib/idxd/idxd.o 00:02:57.882 CC lib/rdma/common.o 00:02:57.883 CC lib/idxd/idxd_user.o 00:02:57.883 CC lib/json/json_write.o 00:02:57.883 CC lib/idxd/idxd_kernel.o 00:02:57.883 CC lib/rdma/rdma_verbs.o 00:02:58.141 LIB libspdk_conf.a 00:02:58.141 LIB libspdk_json.a 00:02:58.141 LIB libspdk_rdma.a 00:02:58.400 LIB libspdk_idxd.a 00:02:58.400 LIB libspdk_vmd.a 00:02:58.400 CC lib/jsonrpc/jsonrpc_server.o 00:02:58.400 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:58.400 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:58.400 CC lib/jsonrpc/jsonrpc_client.o 00:02:58.659 LIB libspdk_jsonrpc.a 00:02:58.918 LIB libspdk_env_dpdk.a 00:02:58.918 CC lib/rpc/rpc.o 00:02:59.178 LIB libspdk_rpc.a 00:02:59.437 CC lib/trace/trace.o 00:02:59.437 CC lib/trace/trace_flags.o 00:02:59.437 CC lib/trace/trace_rpc.o 00:02:59.437 CC lib/notify/notify.o 00:02:59.437 CC lib/notify/notify_rpc.o 00:02:59.437 CC lib/sock/sock.o 00:02:59.437 CC lib/sock/sock_rpc.o 00:02:59.437 LIB libspdk_notify.a 00:02:59.437 LIB libspdk_trace.a 00:02:59.697 LIB libspdk_sock.a 00:02:59.955 CC lib/thread/thread.o 00:02:59.955 CC lib/thread/iobuf.o 00:02:59.955 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:59.955 CC lib/nvme/nvme_ctrlr.o 00:02:59.955 CC 
lib/nvme/nvme_fabric.o 00:02:59.955 CC lib/nvme/nvme_ns_cmd.o 00:02:59.955 CC lib/nvme/nvme_ns.o 00:02:59.955 CC lib/nvme/nvme_pcie_common.o 00:02:59.955 CC lib/nvme/nvme_pcie.o 00:02:59.955 CC lib/nvme/nvme_qpair.o 00:02:59.955 CC lib/nvme/nvme.o 00:02:59.955 CC lib/nvme/nvme_quirks.o 00:02:59.955 CC lib/nvme/nvme_transport.o 00:02:59.955 CC lib/nvme/nvme_discovery.o 00:02:59.955 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:59.955 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:59.955 CC lib/nvme/nvme_tcp.o 00:02:59.955 CC lib/nvme/nvme_opal.o 00:02:59.955 CC lib/nvme/nvme_io_msg.o 00:02:59.955 CC lib/nvme/nvme_poll_group.o 00:02:59.955 CC lib/nvme/nvme_cuse.o 00:02:59.955 CC lib/nvme/nvme_vfio_user.o 00:02:59.955 CC lib/nvme/nvme_zns.o 00:02:59.955 CC lib/nvme/nvme_rdma.o 00:03:00.521 LIB libspdk_thread.a 00:03:01.086 CC lib/init/json_config.o 00:03:01.087 CC lib/init/subsystem.o 00:03:01.087 CC lib/init/rpc.o 00:03:01.087 CC lib/init/subsystem_rpc.o 00:03:01.087 CC lib/blob/blobstore.o 00:03:01.087 CC lib/blob/request.o 00:03:01.087 CC lib/accel/accel.o 00:03:01.087 CC lib/blob/zeroes.o 00:03:01.087 CC lib/accel/accel_rpc.o 00:03:01.087 CC lib/virtio/virtio_vhost_user.o 00:03:01.087 CC lib/blob/blob_bs_dev.o 00:03:01.087 CC lib/accel/accel_sw.o 00:03:01.087 CC lib/virtio/virtio.o 00:03:01.087 CC lib/virtio/virtio_vfio_user.o 00:03:01.087 CC lib/virtio/virtio_pci.o 00:03:01.087 CC lib/vfu_tgt/tgt_endpoint.o 00:03:01.087 CC lib/vfu_tgt/tgt_rpc.o 00:03:01.087 LIB libspdk_init.a 00:03:01.087 LIB libspdk_virtio.a 00:03:01.087 LIB libspdk_vfu_tgt.a 00:03:01.344 LIB libspdk_nvme.a 00:03:01.344 CC lib/event/app.o 00:03:01.344 CC lib/event/reactor.o 00:03:01.344 CC lib/event/scheduler_static.o 00:03:01.344 CC lib/event/log_rpc.o 00:03:01.344 CC lib/event/app_rpc.o 00:03:01.602 LIB libspdk_accel.a 00:03:01.602 LIB libspdk_event.a 00:03:01.860 CC lib/bdev/bdev_rpc.o 00:03:01.860 CC lib/bdev/bdev.o 00:03:01.860 CC lib/bdev/part.o 00:03:01.860 CC lib/bdev/bdev_zone.o 00:03:01.860 CC lib/bdev/scsi_nvme.o 00:03:02.427 LIB libspdk_blob.a 00:03:02.686 CC lib/lvol/lvol.o 00:03:02.686 CC lib/blobfs/blobfs.o 00:03:02.686 CC lib/blobfs/tree.o 00:03:03.254 LIB libspdk_lvol.a 00:03:03.254 LIB libspdk_blobfs.a 00:03:03.514 LIB libspdk_bdev.a 00:03:03.772 CC lib/ftl/ftl_core.o 00:03:03.772 CC lib/ftl/ftl_init.o 00:03:03.772 CC lib/ftl/ftl_layout.o 00:03:03.772 CC lib/ftl/ftl_debug.o 00:03:03.772 CC lib/ftl/ftl_io.o 00:03:03.772 CC lib/ftl/ftl_sb.o 00:03:03.772 CC lib/ftl/ftl_l2p.o 00:03:03.772 CC lib/ftl/ftl_l2p_flat.o 00:03:03.772 CC lib/nvmf/ctrlr_bdev.o 00:03:03.772 CC lib/nvmf/ctrlr.o 00:03:03.772 CC lib/scsi/dev.o 00:03:03.772 CC lib/ftl/ftl_band.o 00:03:03.772 CC lib/scsi/lun.o 00:03:03.772 CC lib/nvmf/ctrlr_discovery.o 00:03:03.772 CC lib/ftl/ftl_nv_cache.o 00:03:03.772 CC lib/nvmf/subsystem.o 00:03:03.772 CC lib/scsi/port.o 00:03:03.772 CC lib/ftl/ftl_band_ops.o 00:03:03.772 CC lib/nvmf/nvmf.o 00:03:03.772 CC lib/scsi/scsi.o 00:03:03.772 CC lib/ftl/ftl_writer.o 00:03:03.772 CC lib/scsi/scsi_bdev.o 00:03:03.772 CC lib/nvmf/nvmf_rpc.o 00:03:03.772 CC lib/ftl/ftl_rq.o 00:03:03.772 CC lib/scsi/task.o 00:03:03.772 CC lib/scsi/scsi_pr.o 00:03:03.772 CC lib/nvmf/transport.o 00:03:03.772 CC lib/ftl/ftl_reloc.o 00:03:03.772 CC lib/scsi/scsi_rpc.o 00:03:03.772 CC lib/nvmf/tcp.o 00:03:03.772 CC lib/ftl/ftl_l2p_cache.o 00:03:03.772 CC lib/nbd/nbd.o 00:03:03.772 CC lib/nvmf/vfio_user.o 00:03:03.772 CC lib/ftl/ftl_p2l.o 00:03:03.772 CC lib/nbd/nbd_rpc.o 00:03:03.772 CC lib/ublk/ublk.o 00:03:03.772 CC lib/nvmf/rdma.o 00:03:03.772 
CC lib/ublk/ublk_rpc.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:03.772 CC lib/ftl/utils/ftl_md.o 00:03:03.772 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:03.772 CC lib/ftl/utils/ftl_conf.o 00:03:03.772 CC lib/ftl/utils/ftl_mempool.o 00:03:03.772 CC lib/ftl/utils/ftl_bitmap.o 00:03:03.772 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:03.772 CC lib/ftl/utils/ftl_property.o 00:03:03.772 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:03.772 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:03.772 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:03.772 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:03.772 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:03.772 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:03.772 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:03.772 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:03.772 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:03.772 CC lib/ftl/base/ftl_base_dev.o 00:03:03.772 CC lib/ftl/base/ftl_base_bdev.o 00:03:03.772 CC lib/ftl/ftl_trace.o 00:03:04.031 LIB libspdk_nbd.a 00:03:04.289 LIB libspdk_scsi.a 00:03:04.289 LIB libspdk_ublk.a 00:03:04.289 LIB libspdk_ftl.a 00:03:04.547 CC lib/iscsi/iscsi.o 00:03:04.547 CC lib/iscsi/conn.o 00:03:04.547 CC lib/iscsi/init_grp.o 00:03:04.547 CC lib/iscsi/md5.o 00:03:04.547 CC lib/iscsi/tgt_node.o 00:03:04.547 CC lib/iscsi/param.o 00:03:04.547 CC lib/iscsi/iscsi_subsystem.o 00:03:04.547 CC lib/iscsi/portal_grp.o 00:03:04.547 CC lib/iscsi/iscsi_rpc.o 00:03:04.547 CC lib/iscsi/task.o 00:03:04.547 CC lib/vhost/vhost_rpc.o 00:03:04.547 CC lib/vhost/vhost.o 00:03:04.547 CC lib/vhost/vhost_scsi.o 00:03:04.547 CC lib/vhost/rte_vhost_user.o 00:03:04.547 CC lib/vhost/vhost_blk.o 00:03:05.115 LIB libspdk_nvmf.a 00:03:05.115 LIB libspdk_vhost.a 00:03:05.115 LIB libspdk_iscsi.a 00:03:05.683 CC module/env_dpdk/env_dpdk_rpc.o 00:03:05.683 CC module/vfu_device/vfu_virtio.o 00:03:05.683 CC module/vfu_device/vfu_virtio_blk.o 00:03:05.683 CC module/vfu_device/vfu_virtio_scsi.o 00:03:05.683 CC module/vfu_device/vfu_virtio_rpc.o 00:03:05.683 LIB libspdk_env_dpdk_rpc.a 00:03:05.683 CC module/blob/bdev/blob_bdev.o 00:03:05.683 CC module/accel/error/accel_error_rpc.o 00:03:05.683 CC module/accel/error/accel_error.o 00:03:05.683 CC module/scheduler/gscheduler/gscheduler.o 00:03:05.683 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:05.683 CC module/accel/ioat/accel_ioat.o 00:03:05.683 CC module/accel/ioat/accel_ioat_rpc.o 00:03:05.683 CC module/sock/posix/posix.o 00:03:05.683 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:05.683 CC module/accel/dsa/accel_dsa.o 00:03:05.683 CC module/accel/iaa/accel_iaa.o 00:03:05.683 CC module/accel/iaa/accel_iaa_rpc.o 00:03:05.683 CC module/accel/dsa/accel_dsa_rpc.o 00:03:05.942 LIB libspdk_accel_error.a 00:03:05.942 LIB libspdk_scheduler_gscheduler.a 00:03:05.942 LIB libspdk_scheduler_dpdk_governor.a 00:03:05.942 LIB libspdk_accel_ioat.a 00:03:05.942 LIB libspdk_blob_bdev.a 00:03:05.942 LIB libspdk_scheduler_dynamic.a 00:03:05.942 LIB libspdk_accel_iaa.a 00:03:05.942 LIB libspdk_accel_dsa.a 00:03:05.942 LIB libspdk_vfu_device.a 00:03:06.201 LIB 
libspdk_sock_posix.a 00:03:06.201 CC module/blobfs/bdev/blobfs_bdev.o 00:03:06.201 CC module/bdev/error/vbdev_error.o 00:03:06.201 CC module/bdev/error/vbdev_error_rpc.o 00:03:06.201 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:06.459 CC module/bdev/malloc/bdev_malloc.o 00:03:06.459 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:06.459 CC module/bdev/null/bdev_null.o 00:03:06.459 CC module/bdev/null/bdev_null_rpc.o 00:03:06.459 CC module/bdev/split/vbdev_split_rpc.o 00:03:06.459 CC module/bdev/split/vbdev_split.o 00:03:06.459 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:06.459 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:06.459 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:06.459 CC module/bdev/lvol/vbdev_lvol.o 00:03:06.459 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:06.459 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:06.459 CC module/bdev/passthru/vbdev_passthru.o 00:03:06.459 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:06.459 CC module/bdev/nvme/bdev_nvme.o 00:03:06.459 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:06.459 CC module/bdev/gpt/gpt.o 00:03:06.459 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:06.459 CC module/bdev/raid/bdev_raid_rpc.o 00:03:06.459 CC module/bdev/raid/bdev_raid.o 00:03:06.459 CC module/bdev/gpt/vbdev_gpt.o 00:03:06.459 CC module/bdev/nvme/bdev_mdns_client.o 00:03:06.459 CC module/bdev/delay/vbdev_delay.o 00:03:06.459 CC module/bdev/raid/raid0.o 00:03:06.459 CC module/bdev/nvme/nvme_rpc.o 00:03:06.459 CC module/bdev/ftl/bdev_ftl.o 00:03:06.459 CC module/bdev/raid/bdev_raid_sb.o 00:03:06.459 CC module/bdev/raid/raid1.o 00:03:06.459 CC module/bdev/raid/concat.o 00:03:06.459 CC module/bdev/nvme/vbdev_opal.o 00:03:06.459 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:06.459 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:06.459 CC module/bdev/aio/bdev_aio.o 00:03:06.459 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:06.459 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:06.459 CC module/bdev/iscsi/bdev_iscsi.o 00:03:06.459 CC module/bdev/aio/bdev_aio_rpc.o 00:03:06.459 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:06.459 LIB libspdk_blobfs_bdev.a 00:03:06.459 LIB libspdk_bdev_split.a 00:03:06.459 LIB libspdk_bdev_error.a 00:03:06.459 LIB libspdk_bdev_null.a 00:03:06.459 LIB libspdk_bdev_ftl.a 00:03:06.459 LIB libspdk_bdev_gpt.a 00:03:06.459 LIB libspdk_bdev_passthru.a 00:03:06.718 LIB libspdk_bdev_aio.a 00:03:06.718 LIB libspdk_bdev_malloc.a 00:03:06.719 LIB libspdk_bdev_zone_block.a 00:03:06.719 LIB libspdk_bdev_iscsi.a 00:03:06.719 LIB libspdk_bdev_delay.a 00:03:06.719 LIB libspdk_bdev_lvol.a 00:03:06.719 LIB libspdk_bdev_virtio.a 00:03:06.978 LIB libspdk_bdev_raid.a 00:03:07.545 LIB libspdk_bdev_nvme.a 00:03:08.112 CC module/event/subsystems/vmd/vmd.o 00:03:08.112 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:08.112 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:08.112 CC module/event/subsystems/scheduler/scheduler.o 00:03:08.112 CC module/event/subsystems/sock/sock.o 00:03:08.112 CC module/event/subsystems/iobuf/iobuf.o 00:03:08.112 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:08.112 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:08.112 LIB libspdk_event_vfu_tgt.a 00:03:08.112 LIB libspdk_event_vmd.a 00:03:08.112 LIB libspdk_event_sock.a 00:03:08.112 LIB libspdk_event_scheduler.a 00:03:08.112 LIB libspdk_event_iobuf.a 00:03:08.112 LIB libspdk_event_vhost_blk.a 00:03:08.680 CC module/event/subsystems/accel/accel.o 00:03:08.681 LIB libspdk_event_accel.a 00:03:08.940 CC module/event/subsystems/bdev/bdev.o 00:03:08.940 LIB 
libspdk_event_bdev.a 00:03:09.508 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:09.508 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:09.508 CC module/event/subsystems/ublk/ublk.o 00:03:09.508 CC module/event/subsystems/scsi/scsi.o 00:03:09.508 CC module/event/subsystems/nbd/nbd.o 00:03:09.508 LIB libspdk_event_nbd.a 00:03:09.508 LIB libspdk_event_ublk.a 00:03:09.509 LIB libspdk_event_scsi.a 00:03:09.509 LIB libspdk_event_nvmf.a 00:03:09.767 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:09.767 CC module/event/subsystems/iscsi/iscsi.o 00:03:09.767 LIB libspdk_event_vhost_scsi.a 00:03:09.767 LIB libspdk_event_iscsi.a 00:03:10.335 CXX app/trace/trace.o 00:03:10.335 CC app/spdk_lspci/spdk_lspci.o 00:03:10.335 TEST_HEADER include/spdk/accel.h 00:03:10.335 TEST_HEADER include/spdk/assert.h 00:03:10.335 TEST_HEADER include/spdk/barrier.h 00:03:10.335 TEST_HEADER include/spdk/accel_module.h 00:03:10.335 CC app/spdk_nvme_identify/identify.o 00:03:10.335 TEST_HEADER include/spdk/base64.h 00:03:10.335 TEST_HEADER include/spdk/bdev_module.h 00:03:10.335 TEST_HEADER include/spdk/bdev.h 00:03:10.335 TEST_HEADER include/spdk/bdev_zone.h 00:03:10.335 CC app/trace_record/trace_record.o 00:03:10.335 TEST_HEADER include/spdk/bit_array.h 00:03:10.335 TEST_HEADER include/spdk/bit_pool.h 00:03:10.335 TEST_HEADER include/spdk/blob_bdev.h 00:03:10.335 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:10.335 TEST_HEADER include/spdk/blobfs.h 00:03:10.335 CC app/spdk_nvme_discover/discovery_aer.o 00:03:10.335 TEST_HEADER include/spdk/blob.h 00:03:10.335 TEST_HEADER include/spdk/conf.h 00:03:10.335 CC app/spdk_nvme_perf/perf.o 00:03:10.335 TEST_HEADER include/spdk/cpuset.h 00:03:10.335 TEST_HEADER include/spdk/config.h 00:03:10.335 TEST_HEADER include/spdk/crc16.h 00:03:10.335 TEST_HEADER include/spdk/crc32.h 00:03:10.335 CC test/rpc_client/rpc_client_test.o 00:03:10.335 TEST_HEADER include/spdk/crc64.h 00:03:10.335 TEST_HEADER include/spdk/dif.h 00:03:10.335 TEST_HEADER include/spdk/dma.h 00:03:10.335 CC app/spdk_top/spdk_top.o 00:03:10.335 TEST_HEADER include/spdk/endian.h 00:03:10.335 TEST_HEADER include/spdk/env_dpdk.h 00:03:10.335 TEST_HEADER include/spdk/env.h 00:03:10.335 TEST_HEADER include/spdk/event.h 00:03:10.335 TEST_HEADER include/spdk/fd_group.h 00:03:10.335 TEST_HEADER include/spdk/fd.h 00:03:10.335 TEST_HEADER include/spdk/file.h 00:03:10.335 TEST_HEADER include/spdk/ftl.h 00:03:10.335 TEST_HEADER include/spdk/gpt_spec.h 00:03:10.335 TEST_HEADER include/spdk/hexlify.h 00:03:10.335 TEST_HEADER include/spdk/histogram_data.h 00:03:10.335 TEST_HEADER include/spdk/idxd.h 00:03:10.335 TEST_HEADER include/spdk/idxd_spec.h 00:03:10.335 TEST_HEADER include/spdk/init.h 00:03:10.335 TEST_HEADER include/spdk/ioat.h 00:03:10.335 TEST_HEADER include/spdk/ioat_spec.h 00:03:10.335 TEST_HEADER include/spdk/iscsi_spec.h 00:03:10.335 TEST_HEADER include/spdk/json.h 00:03:10.335 TEST_HEADER include/spdk/jsonrpc.h 00:03:10.335 TEST_HEADER include/spdk/log.h 00:03:10.335 TEST_HEADER include/spdk/likely.h 00:03:10.335 TEST_HEADER include/spdk/lvol.h 00:03:10.335 CC app/spdk_dd/spdk_dd.o 00:03:10.335 CC app/iscsi_tgt/iscsi_tgt.o 00:03:10.335 TEST_HEADER include/spdk/memory.h 00:03:10.335 TEST_HEADER include/spdk/mmio.h 00:03:10.335 TEST_HEADER include/spdk/nbd.h 00:03:10.335 TEST_HEADER include/spdk/notify.h 00:03:10.335 TEST_HEADER include/spdk/nvme.h 00:03:10.335 TEST_HEADER include/spdk/nvme_intel.h 00:03:10.335 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:10.335 TEST_HEADER include/spdk/nvme_spec.h 
00:03:10.335 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:10.335 TEST_HEADER include/spdk/nvme_zns.h 00:03:10.335 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:10.335 TEST_HEADER include/spdk/nvmf.h 00:03:10.335 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:10.335 TEST_HEADER include/spdk/nvmf_spec.h 00:03:10.335 TEST_HEADER include/spdk/nvmf_transport.h 00:03:10.335 TEST_HEADER include/spdk/opal.h 00:03:10.335 TEST_HEADER include/spdk/opal_spec.h 00:03:10.335 TEST_HEADER include/spdk/pci_ids.h 00:03:10.335 TEST_HEADER include/spdk/pipe.h 00:03:10.335 TEST_HEADER include/spdk/queue.h 00:03:10.335 CC app/vhost/vhost.o 00:03:10.335 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:10.335 TEST_HEADER include/spdk/rpc.h 00:03:10.335 TEST_HEADER include/spdk/reduce.h 00:03:10.335 TEST_HEADER include/spdk/scheduler.h 00:03:10.335 TEST_HEADER include/spdk/scsi.h 00:03:10.335 TEST_HEADER include/spdk/scsi_spec.h 00:03:10.335 TEST_HEADER include/spdk/sock.h 00:03:10.335 TEST_HEADER include/spdk/stdinc.h 00:03:10.335 TEST_HEADER include/spdk/string.h 00:03:10.335 TEST_HEADER include/spdk/thread.h 00:03:10.335 TEST_HEADER include/spdk/trace.h 00:03:10.335 TEST_HEADER include/spdk/trace_parser.h 00:03:10.335 TEST_HEADER include/spdk/tree.h 00:03:10.335 TEST_HEADER include/spdk/util.h 00:03:10.335 TEST_HEADER include/spdk/ublk.h 00:03:10.335 TEST_HEADER include/spdk/uuid.h 00:03:10.335 TEST_HEADER include/spdk/version.h 00:03:10.335 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:10.335 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:10.335 TEST_HEADER include/spdk/vhost.h 00:03:10.335 TEST_HEADER include/spdk/vmd.h 00:03:10.335 TEST_HEADER include/spdk/xor.h 00:03:10.335 CXX test/cpp_headers/accel.o 00:03:10.335 TEST_HEADER include/spdk/zipf.h 00:03:10.335 CXX test/cpp_headers/accel_module.o 00:03:10.335 CXX test/cpp_headers/base64.o 00:03:10.335 CXX test/cpp_headers/assert.o 00:03:10.335 CXX test/cpp_headers/barrier.o 00:03:10.335 CC app/nvmf_tgt/nvmf_main.o 00:03:10.335 CXX test/cpp_headers/bdev.o 00:03:10.335 CXX test/cpp_headers/bdev_module.o 00:03:10.335 CXX test/cpp_headers/bdev_zone.o 00:03:10.335 CC app/spdk_tgt/spdk_tgt.o 00:03:10.335 CXX test/cpp_headers/bit_array.o 00:03:10.335 CXX test/cpp_headers/bit_pool.o 00:03:10.335 CXX test/cpp_headers/blob_bdev.o 00:03:10.335 CXX test/cpp_headers/blobfs_bdev.o 00:03:10.335 CXX test/cpp_headers/blobfs.o 00:03:10.335 CXX test/cpp_headers/blob.o 00:03:10.335 CXX test/cpp_headers/conf.o 00:03:10.335 CXX test/cpp_headers/config.o 00:03:10.335 CXX test/cpp_headers/cpuset.o 00:03:10.335 CXX test/cpp_headers/crc16.o 00:03:10.335 CXX test/cpp_headers/crc32.o 00:03:10.335 CXX test/cpp_headers/crc64.o 00:03:10.335 CXX test/cpp_headers/dif.o 00:03:10.335 CXX test/cpp_headers/dma.o 00:03:10.335 CXX test/cpp_headers/endian.o 00:03:10.335 CXX test/cpp_headers/env_dpdk.o 00:03:10.335 CXX test/cpp_headers/env.o 00:03:10.335 CXX test/cpp_headers/event.o 00:03:10.335 CXX test/cpp_headers/fd_group.o 00:03:10.335 CXX test/cpp_headers/fd.o 00:03:10.335 CXX test/cpp_headers/file.o 00:03:10.335 CXX test/cpp_headers/ftl.o 00:03:10.335 CXX test/cpp_headers/gpt_spec.o 00:03:10.335 CXX test/cpp_headers/hexlify.o 00:03:10.335 CXX test/cpp_headers/histogram_data.o 00:03:10.335 CXX test/cpp_headers/idxd.o 00:03:10.335 CXX test/cpp_headers/idxd_spec.o 00:03:10.335 CXX test/cpp_headers/init.o 00:03:10.335 CC examples/nvme/abort/abort.o 00:03:10.335 CC test/thread/lock/spdk_lock.o 00:03:10.335 CC examples/nvme/hotplug/hotplug.o 00:03:10.335 CC test/app/stub/stub.o 00:03:10.335 CC 
examples/nvme/arbitration/arbitration.o 00:03:10.335 CC examples/vmd/lsvmd/lsvmd.o 00:03:10.335 CC examples/accel/perf/accel_perf.o 00:03:10.335 CC test/app/histogram_perf/histogram_perf.o 00:03:10.335 CC examples/vmd/led/led.o 00:03:10.335 CC test/thread/poller_perf/poller_perf.o 00:03:10.335 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:10.336 CC examples/sock/hello_world/hello_sock.o 00:03:10.336 CC test/env/vtophys/vtophys.o 00:03:10.336 CC test/nvme/connect_stress/connect_stress.o 00:03:10.336 CC examples/nvme/hello_world/hello_world.o 00:03:10.336 CC test/env/pci/pci_ut.o 00:03:10.336 CC examples/ioat/perf/perf.o 00:03:10.336 CC examples/nvme/reconnect/reconnect.o 00:03:10.336 CC test/nvme/aer/aer.o 00:03:10.336 CC test/event/event_perf/event_perf.o 00:03:10.336 CC test/nvme/err_injection/err_injection.o 00:03:10.336 CC test/app/jsoncat/jsoncat.o 00:03:10.336 CC test/nvme/sgl/sgl.o 00:03:10.336 CC test/nvme/reset/reset.o 00:03:10.336 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:10.336 CC test/event/reactor_perf/reactor_perf.o 00:03:10.336 CC test/env/memory/memory_ut.o 00:03:10.336 CC test/nvme/startup/startup.o 00:03:10.336 CC test/nvme/simple_copy/simple_copy.o 00:03:10.336 CC test/nvme/overhead/overhead.o 00:03:10.336 CC test/event/reactor/reactor.o 00:03:10.336 CC test/nvme/reserve/reserve.o 00:03:10.336 CC test/nvme/fused_ordering/fused_ordering.o 00:03:10.336 CC test/nvme/compliance/nvme_compliance.o 00:03:10.336 CC examples/ioat/verify/verify.o 00:03:10.336 CC examples/util/zipf/zipf.o 00:03:10.336 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:10.336 LINK spdk_lspci 00:03:10.336 CC test/nvme/e2edp/nvme_dp.o 00:03:10.336 CC test/nvme/cuse/cuse.o 00:03:10.336 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:10.336 CC app/fio/nvme/fio_plugin.o 00:03:10.336 CC test/nvme/fdp/fdp.o 00:03:10.336 CC examples/idxd/perf/perf.o 00:03:10.336 CC test/nvme/boot_partition/boot_partition.o 00:03:10.336 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:10.336 CC test/event/app_repeat/app_repeat.o 00:03:10.336 CC test/app/bdev_svc/bdev_svc.o 00:03:10.336 CC test/dma/test_dma/test_dma.o 00:03:10.336 CC test/bdev/bdevio/bdevio.o 00:03:10.336 CC examples/thread/thread/thread_ex.o 00:03:10.336 CC test/blobfs/mkfs/mkfs.o 00:03:10.336 CC examples/blob/hello_world/hello_blob.o 00:03:10.336 CC test/accel/dif/dif.o 00:03:10.336 CC examples/bdev/hello_world/hello_bdev.o 00:03:10.336 CC examples/blob/cli/blobcli.o 00:03:10.336 CC examples/nvmf/nvmf/nvmf.o 00:03:10.336 CC examples/bdev/bdevperf/bdevperf.o 00:03:10.336 CC app/fio/bdev/fio_plugin.o 00:03:10.336 CC test/event/scheduler/scheduler.o 00:03:10.336 CC test/env/mem_callbacks/mem_callbacks.o 00:03:10.336 CC test/lvol/esnap/esnap.o 00:03:10.336 LINK rpc_client_test 00:03:10.336 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:10.336 LINK spdk_nvme_discover 00:03:10.602 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:10.602 LINK spdk_trace_record 00:03:10.602 CXX test/cpp_headers/ioat.o 00:03:10.602 CXX test/cpp_headers/ioat_spec.o 00:03:10.602 CXX test/cpp_headers/iscsi_spec.o 00:03:10.602 LINK lsvmd 00:03:10.602 LINK vhost 00:03:10.602 CXX test/cpp_headers/json.o 00:03:10.602 LINK interrupt_tgt 00:03:10.602 CXX test/cpp_headers/jsonrpc.o 00:03:10.602 CXX test/cpp_headers/likely.o 00:03:10.602 CXX test/cpp_headers/log.o 00:03:10.602 CXX test/cpp_headers/lvol.o 00:03:10.602 CXX test/cpp_headers/memory.o 00:03:10.602 CXX test/cpp_headers/mmio.o 00:03:10.602 CXX test/cpp_headers/nbd.o 00:03:10.602 CXX test/cpp_headers/notify.o 
00:03:10.602 CXX test/cpp_headers/nvme.o 00:03:10.602 LINK histogram_perf 00:03:10.603 LINK led 00:03:10.603 LINK vtophys 00:03:10.603 LINK jsoncat 00:03:10.603 CXX test/cpp_headers/nvme_intel.o 00:03:10.603 CXX test/cpp_headers/nvme_ocssd.o 00:03:10.603 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:10.603 LINK iscsi_tgt 00:03:10.603 CXX test/cpp_headers/nvme_spec.o 00:03:10.603 CXX test/cpp_headers/nvme_zns.o 00:03:10.603 LINK poller_perf 00:03:10.603 CXX test/cpp_headers/nvmf_cmd.o 00:03:10.603 LINK reactor 00:03:10.603 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:10.603 CXX test/cpp_headers/nvmf.o 00:03:10.603 CXX test/cpp_headers/nvmf_spec.o 00:03:10.603 CXX test/cpp_headers/nvmf_transport.o 00:03:10.603 CXX test/cpp_headers/opal.o 00:03:10.603 LINK nvmf_tgt 00:03:10.603 CXX test/cpp_headers/opal_spec.o 00:03:10.603 LINK reactor_perf 00:03:10.603 CXX test/cpp_headers/pci_ids.o 00:03:10.603 LINK event_perf 00:03:10.603 CXX test/cpp_headers/pipe.o 00:03:10.603 CXX test/cpp_headers/queue.o 00:03:10.603 LINK zipf 00:03:10.603 CXX test/cpp_headers/reduce.o 00:03:10.603 CXX test/cpp_headers/rpc.o 00:03:10.603 CXX test/cpp_headers/scheduler.o 00:03:10.603 LINK env_dpdk_post_init 00:03:10.603 CXX test/cpp_headers/scsi.o 00:03:10.603 CXX test/cpp_headers/scsi_spec.o 00:03:10.603 LINK connect_stress 00:03:10.603 LINK startup 00:03:10.603 LINK stub 00:03:10.603 LINK app_repeat 00:03:10.603 LINK boot_partition 00:03:10.603 LINK err_injection 00:03:10.603 LINK spdk_tgt 00:03:10.603 LINK pmr_persistence 00:03:10.603 LINK fused_ordering 00:03:10.603 LINK doorbell_aers 00:03:10.603 CXX test/cpp_headers/sock.o 00:03:10.603 LINK reserve 00:03:10.603 LINK hotplug 00:03:10.603 LINK cmb_copy 00:03:10.603 LINK verify 00:03:10.603 LINK ioat_perf 00:03:10.603 LINK bdev_svc 00:03:10.603 LINK hello_world 00:03:10.603 LINK simple_copy 00:03:10.603 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:10.603 LINK hello_sock 00:03:10.603 LINK mkfs 00:03:10.603 LINK aer 00:03:10.603 LINK sgl 00:03:10.603 LINK overhead 00:03:10.603 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:10.603 LINK fdp 00:03:10.603 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:10.603 LINK reset 00:03:10.603 LINK nvme_dp 00:03:10.603 LINK hello_blob 00:03:10.603 LINK hello_bdev 00:03:10.603 LINK spdk_trace 00:03:10.603 LINK scheduler 00:03:10.603 LINK mem_callbacks 00:03:10.865 CXX test/cpp_headers/stdinc.o 00:03:10.865 CXX test/cpp_headers/string.o 00:03:10.865 LINK thread 00:03:10.865 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:10.865 CXX test/cpp_headers/thread.o 00:03:10.865 CXX test/cpp_headers/trace.o 00:03:10.865 CXX test/cpp_headers/trace_parser.o 00:03:10.865 CXX test/cpp_headers/tree.o 00:03:10.865 CXX test/cpp_headers/ublk.o 00:03:10.865 CXX test/cpp_headers/util.o 00:03:10.865 CXX test/cpp_headers/uuid.o 00:03:10.865 CXX test/cpp_headers/version.o 00:03:10.865 CXX test/cpp_headers/vfio_user_pci.o 00:03:10.865 CXX test/cpp_headers/vfio_user_spec.o 00:03:10.865 CXX test/cpp_headers/vhost.o 00:03:10.865 CXX test/cpp_headers/vmd.o 00:03:10.865 CXX test/cpp_headers/xor.o 00:03:10.865 LINK nvmf 00:03:10.865 LINK arbitration 00:03:10.865 CXX test/cpp_headers/zipf.o 00:03:10.865 LINK idxd_perf 00:03:10.865 LINK reconnect 00:03:10.865 LINK abort 00:03:10.865 LINK spdk_dd 00:03:10.865 LINK bdevio 00:03:10.865 LINK test_dma 00:03:10.865 LINK dif 00:03:10.865 LINK nvme_manage 00:03:10.865 LINK pci_ut 00:03:10.865 LINK accel_perf 00:03:10.865 LINK nvme_compliance 00:03:10.865 LINK nvme_fuzz 00:03:11.125 LINK blobcli 00:03:11.125 
LINK spdk_nvme 00:03:11.125 LINK llvm_vfio_fuzz 00:03:11.125 LINK memory_ut 00:03:11.125 LINK spdk_nvme_identify 00:03:11.125 LINK spdk_bdev 00:03:11.125 LINK spdk_top 00:03:11.383 LINK spdk_nvme_perf 00:03:11.383 LINK vhost_fuzz 00:03:11.383 LINK bdevperf 00:03:11.641 LINK llvm_nvme_fuzz 00:03:11.641 LINK cuse 00:03:11.899 LINK spdk_lock 00:03:12.157 LINK iscsi_fuzz 00:03:14.061 LINK esnap 00:03:14.321 00:03:14.321 real 0m23.209s 00:03:14.321 user 4m13.352s 00:03:14.321 sys 2m2.439s 00:03:14.321 07:28:24 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:14.321 07:28:24 -- common/autotest_common.sh@10 -- $ set +x 00:03:14.321 ************************************ 00:03:14.321 END TEST make 00:03:14.321 ************************************ 00:03:14.580 07:28:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:14.580 07:28:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:14.580 07:28:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:14.580 07:28:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:14.580 07:28:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:14.580 07:28:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:14.580 07:28:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:14.580 07:28:25 -- scripts/common.sh@335 -- # IFS=.-: 00:03:14.580 07:28:25 -- scripts/common.sh@335 -- # read -ra ver1 00:03:14.580 07:28:25 -- scripts/common.sh@336 -- # IFS=.-: 00:03:14.580 07:28:25 -- scripts/common.sh@336 -- # read -ra ver2 00:03:14.580 07:28:25 -- scripts/common.sh@337 -- # local 'op=<' 00:03:14.580 07:28:25 -- scripts/common.sh@339 -- # ver1_l=2 00:03:14.580 07:28:25 -- scripts/common.sh@340 -- # ver2_l=1 00:03:14.580 07:28:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:14.580 07:28:25 -- scripts/common.sh@343 -- # case "$op" in 00:03:14.580 07:28:25 -- scripts/common.sh@344 -- # : 1 00:03:14.580 07:28:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:14.580 07:28:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:14.580 07:28:25 -- scripts/common.sh@364 -- # decimal 1 00:03:14.580 07:28:25 -- scripts/common.sh@352 -- # local d=1 00:03:14.580 07:28:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:14.580 07:28:25 -- scripts/common.sh@354 -- # echo 1 00:03:14.580 07:28:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:14.580 07:28:25 -- scripts/common.sh@365 -- # decimal 2 00:03:14.580 07:28:25 -- scripts/common.sh@352 -- # local d=2 00:03:14.580 07:28:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:14.580 07:28:25 -- scripts/common.sh@354 -- # echo 2 00:03:14.580 07:28:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:14.580 07:28:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:14.581 07:28:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:14.581 07:28:25 -- scripts/common.sh@367 -- # return 0 00:03:14.581 07:28:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:14.581 07:28:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:14.581 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:14.581 --rc genhtml_branch_coverage=1 00:03:14.581 --rc genhtml_function_coverage=1 00:03:14.581 --rc genhtml_legend=1 00:03:14.581 --rc geninfo_all_blocks=1 00:03:14.581 --rc geninfo_unexecuted_blocks=1 00:03:14.581 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:14.581 ' 00:03:14.581 07:28:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:14.581 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:14.581 --rc genhtml_branch_coverage=1 00:03:14.581 --rc genhtml_function_coverage=1 00:03:14.581 --rc genhtml_legend=1 00:03:14.581 --rc geninfo_all_blocks=1 00:03:14.581 --rc geninfo_unexecuted_blocks=1 00:03:14.581 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:14.581 ' 00:03:14.581 07:28:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:14.581 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:14.581 --rc genhtml_branch_coverage=1 00:03:14.581 --rc genhtml_function_coverage=1 00:03:14.581 --rc genhtml_legend=1 00:03:14.581 --rc geninfo_all_blocks=1 00:03:14.581 --rc geninfo_unexecuted_blocks=1 00:03:14.581 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:14.581 ' 00:03:14.581 07:28:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:14.581 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:14.581 --rc genhtml_branch_coverage=1 00:03:14.581 --rc genhtml_function_coverage=1 00:03:14.581 --rc genhtml_legend=1 00:03:14.581 --rc geninfo_all_blocks=1 00:03:14.581 --rc geninfo_unexecuted_blocks=1 00:03:14.581 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:14.581 ' 00:03:14.581 07:28:25 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:14.581 07:28:25 -- nvmf/common.sh@7 -- # uname -s 00:03:14.581 07:28:25 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:14.581 07:28:25 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:14.581 07:28:25 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:14.581 07:28:25 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:14.581 07:28:25 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:14.581 07:28:25 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:14.581 07:28:25 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:14.581 07:28:25 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:14.581 07:28:25 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:14.581 07:28:25 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:14.581 07:28:25 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:14.581 07:28:25 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:14.581 07:28:25 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:14.581 07:28:25 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:14.581 07:28:25 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:14.581 07:28:25 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:14.581 07:28:25 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:14.581 07:28:25 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:14.581 07:28:25 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:14.581 07:28:25 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:14.581 07:28:25 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:14.581 07:28:25 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:14.581 07:28:25 -- paths/export.sh@5 -- # export PATH 00:03:14.581 07:28:25 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:14.581 07:28:25 -- nvmf/common.sh@46 -- # : 0 00:03:14.581 07:28:25 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:14.581 07:28:25 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:14.581 07:28:25 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:14.581 07:28:25 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:14.581 07:28:25 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:14.581 07:28:25 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:14.581 07:28:25 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:14.581 07:28:25 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:14.581 07:28:25 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:14.581 07:28:25 -- spdk/autotest.sh@32 -- # uname -s 00:03:14.581 07:28:25 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:14.581 07:28:25 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:14.581 07:28:25 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:14.581 07:28:25 -- 
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:14.581 07:28:25 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:14.581 07:28:25 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:14.581 07:28:25 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:14.581 07:28:25 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:14.581 07:28:25 -- spdk/autotest.sh@48 -- # udevadm_pid=1594067 00:03:14.581 07:28:25 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:14.581 07:28:25 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:14.581 07:28:25 -- spdk/autotest.sh@54 -- # echo 1594069 00:03:14.581 07:28:25 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:14.581 07:28:25 -- spdk/autotest.sh@56 -- # echo 1594070 00:03:14.581 07:28:25 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:14.581 07:28:25 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:14.581 07:28:25 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:14.581 07:28:25 -- spdk/autotest.sh@60 -- # echo 1594071 00:03:14.581 07:28:25 -- spdk/autotest.sh@62 -- # echo 1594072 00:03:14.581 07:28:25 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:14.581 07:28:25 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:14.581 07:28:25 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:14.581 07:28:25 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:14.581 07:28:25 -- common/autotest_common.sh@10 -- # set +x 00:03:14.581 07:28:25 -- spdk/autotest.sh@70 -- # create_test_list 00:03:14.581 07:28:25 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:14.581 07:28:25 -- common/autotest_common.sh@10 -- # set +x 00:03:14.581 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:14.581 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:14.581 07:28:25 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:14.581 07:28:25 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:14.581 07:28:25 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:14.581 07:28:25 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:14.581 07:28:25 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:14.581 07:28:25 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:14.581 07:28:25 -- common/autotest_common.sh@1450 -- # uname 00:03:14.581 07:28:25 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:14.581 07:28:25 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
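The autotest prologue above saves the host's systemd-coredump handler and points the kernel's core_pattern pipe at SPDK's core-collector.sh, so any crash during the run is captured under ../output/coredumps. A minimal sketch of that redirection, assuming root; the collector path is the one from this run, and the real script keeps more state than shown:

    old_core_pattern=$(cat /proc/sys/kernel/core_pattern)   # e.g. the systemd-coredump handler saved above
    echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' \
        > /proc/sys/kernel/core_pattern                     # %P = crashing PID, %s = signal, %t = dump time
    trap 'echo "$old_core_pattern" > /proc/sys/kernel/core_pattern' EXIT   # restore the saved handler on exit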
00:03:14.581 07:28:25 -- common/autotest_common.sh@1470 -- # uname 00:03:14.840 07:28:25 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:14.840 07:28:25 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:14.840 07:28:25 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:14.840 lcov: LCOV version 1.15 00:03:14.840 07:28:25 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:22.967 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:22.967 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:22.967 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:29.594 07:28:39 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:29.594 07:28:39 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:29.594 07:28:39 -- common/autotest_common.sh@10 -- # set +x 00:03:29.594 07:28:39 -- spdk/autotest.sh@89 -- # rm -f 00:03:29.594 07:28:39 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:32.161 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:32.161 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:32.161 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:32.161 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:32.161 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:32.419 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:32.420 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:32.420 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:32.420 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:32.678 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:32.678 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:32.678 07:28:43 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:32.678 07:28:43 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:32.678 07:28:43 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:32.678 07:28:43 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:32.678 07:28:43 -- 
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:32.678 07:28:43 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:32.678 07:28:43 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:32.678 07:28:43 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:32.678 07:28:43 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:32.678 07:28:43 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:32.678 07:28:43 -- spdk/autotest.sh@108 -- # grep -v p 00:03:32.678 07:28:43 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:32.678 07:28:43 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:32.678 07:28:43 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:32.678 07:28:43 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:32.678 07:28:43 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:32.679 07:28:43 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:32.679 No valid GPT data, bailing 00:03:32.679 07:28:43 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:32.679 07:28:43 -- scripts/common.sh@393 -- # pt= 00:03:32.679 07:28:43 -- scripts/common.sh@394 -- # return 1 00:03:32.679 07:28:43 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:32.679 1+0 records in 00:03:32.679 1+0 records out 00:03:32.679 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00186897 s, 561 MB/s 00:03:32.679 07:28:43 -- spdk/autotest.sh@116 -- # sync 00:03:32.679 07:28:43 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:32.679 07:28:43 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:32.679 07:28:43 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:39.252 07:28:49 -- spdk/autotest.sh@122 -- # uname -s 00:03:39.252 07:28:49 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:39.252 07:28:49 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:39.252 07:28:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:39.252 07:28:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:39.252 07:28:49 -- common/autotest_common.sh@10 -- # set +x 00:03:39.252 ************************************ 00:03:39.252 START TEST setup.sh 00:03:39.252 ************************************ 00:03:39.252 07:28:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:39.252 * Looking for test storage... 
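In the pre-cleanup pass above, autotest probed /dev/nvme0n1 with spdk-gpt.py and blkid before zeroing it: "No valid GPT data, bailing" plus an empty PTTYPE mean the disk carries no partition table, the in-use check comes back negative, and the first MiB is wiped. A minimal sketch of that guard, assuming a disposable test device (DEV is illustrative):

    DEV=/dev/nvme0n1
    pt=$(blkid -s PTTYPE -o value "$DEV")         # empty output => no GPT/MBR partition table found
    if [[ -z "$pt" ]]; then
        dd if=/dev/zero of="$DEV" bs=1M count=1   # zero the first MiB to clear stale metadata
    fi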
00:03:39.252 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:39.252 07:28:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:39.252 07:28:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:39.252 07:28:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:39.512 07:28:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:39.512 07:28:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:39.512 07:28:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:39.512 07:28:50 -- scripts/common.sh@335 -- # IFS=.-: 00:03:39.512 07:28:50 -- scripts/common.sh@335 -- # read -ra ver1 00:03:39.512 07:28:50 -- scripts/common.sh@336 -- # IFS=.-: 00:03:39.512 07:28:50 -- scripts/common.sh@336 -- # read -ra ver2 00:03:39.512 07:28:50 -- scripts/common.sh@337 -- # local 'op=<' 00:03:39.512 07:28:50 -- scripts/common.sh@339 -- # ver1_l=2 00:03:39.512 07:28:50 -- scripts/common.sh@340 -- # ver2_l=1 00:03:39.512 07:28:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:39.512 07:28:50 -- scripts/common.sh@343 -- # case "$op" in 00:03:39.512 07:28:50 -- scripts/common.sh@344 -- # : 1 00:03:39.512 07:28:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:39.512 07:28:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:39.512 07:28:50 -- scripts/common.sh@364 -- # decimal 1 00:03:39.512 07:28:50 -- scripts/common.sh@352 -- # local d=1 00:03:39.512 07:28:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:39.512 07:28:50 -- scripts/common.sh@354 -- # echo 1 00:03:39.512 07:28:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:39.512 07:28:50 -- scripts/common.sh@365 -- # decimal 2 00:03:39.512 07:28:50 -- scripts/common.sh@352 -- # local d=2 00:03:39.512 07:28:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:39.512 07:28:50 -- scripts/common.sh@354 -- # echo 2 00:03:39.512 07:28:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:39.512 07:28:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:39.512 07:28:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:39.512 07:28:50 -- scripts/common.sh@367 -- # return 0 00:03:39.512 07:28:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 
00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- setup/test-setup.sh@10 -- # uname -s 00:03:39.512 07:28:50 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:39.512 07:28:50 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:39.512 07:28:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:39.512 07:28:50 -- common/autotest_common.sh@10 -- # set +x 00:03:39.512 ************************************ 00:03:39.512 START TEST acl 00:03:39.512 ************************************ 00:03:39.512 07:28:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:39.512 * Looking for test storage... 00:03:39.512 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:39.512 07:28:50 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:39.512 07:28:50 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:39.512 07:28:50 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:39.512 07:28:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:39.512 07:28:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:39.512 07:28:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:39.512 07:28:50 -- scripts/common.sh@335 -- # IFS=.-: 00:03:39.512 07:28:50 -- scripts/common.sh@335 -- # read -ra ver1 00:03:39.512 07:28:50 -- scripts/common.sh@336 -- # IFS=.-: 00:03:39.512 07:28:50 -- scripts/common.sh@336 -- # read -ra ver2 00:03:39.512 07:28:50 -- scripts/common.sh@337 -- # local 'op=<' 00:03:39.512 07:28:50 -- scripts/common.sh@339 -- # ver1_l=2 00:03:39.512 07:28:50 -- scripts/common.sh@340 -- # ver2_l=1 00:03:39.512 07:28:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:39.512 07:28:50 -- scripts/common.sh@343 -- # case "$op" in 00:03:39.512 07:28:50 -- scripts/common.sh@344 -- # : 1 00:03:39.512 07:28:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:39.512 07:28:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:39.512 07:28:50 -- scripts/common.sh@364 -- # decimal 1 00:03:39.512 07:28:50 -- scripts/common.sh@352 -- # local d=1 00:03:39.512 07:28:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:39.512 07:28:50 -- scripts/common.sh@354 -- # echo 1 00:03:39.512 07:28:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:39.512 07:28:50 -- scripts/common.sh@365 -- # decimal 2 00:03:39.512 07:28:50 -- scripts/common.sh@352 -- # local d=2 00:03:39.512 07:28:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:39.512 07:28:50 -- scripts/common.sh@354 -- # echo 2 00:03:39.512 07:28:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:39.512 07:28:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:39.512 07:28:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:39.512 07:28:50 -- scripts/common.sh@367 -- # return 0 00:03:39.512 07:28:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.512 ' 00:03:39.512 07:28:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:39.512 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.512 --rc genhtml_branch_coverage=1 00:03:39.512 --rc genhtml_function_coverage=1 00:03:39.512 --rc genhtml_legend=1 00:03:39.512 --rc geninfo_all_blocks=1 00:03:39.512 --rc geninfo_unexecuted_blocks=1 00:03:39.512 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.513 ' 00:03:39.513 07:28:50 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:39.513 07:28:50 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:39.513 07:28:50 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:39.513 07:28:50 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:39.513 07:28:50 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:39.513 07:28:50 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:39.513 07:28:50 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:39.513 07:28:50 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:03:39.513 07:28:50 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:39.513 07:28:50 -- setup/acl.sh@12 -- # devs=() 00:03:39.513 07:28:50 -- setup/acl.sh@12 -- # declare -a devs 00:03:39.513 07:28:50 -- setup/acl.sh@13 -- # drivers=() 00:03:39.513 07:28:50 -- setup/acl.sh@13 -- # declare -A drivers 00:03:39.513 07:28:50 -- setup/acl.sh@51 -- # setup reset 00:03:39.513 07:28:50 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:39.513 07:28:50 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:43.711 07:28:53 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:43.711 07:28:53 -- setup/acl.sh@16 -- # local dev driver 00:03:43.711 07:28:53 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:43.711 07:28:53 -- setup/acl.sh@15 -- # setup output status 00:03:43.711 07:28:53 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:43.711 07:28:53 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:47.024 Hugepages 00:03:47.024 node hugesize free / total 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 00:03:47.024 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.024 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.024 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.024 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 
07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # continue 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:47.025 07:28:57 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:47.025 07:28:57 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:47.025 07:28:57 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:47.025 07:28:57 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:47.025 07:28:57 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:47.025 07:28:57 -- setup/acl.sh@54 -- # run_test denied denied 00:03:47.025 07:28:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:47.025 07:28:57 -- common/autotest_common.sh@1093 -- # 
00:03:47.025 07:28:57 -- setup/acl.sh@54 -- # run_test denied denied
00:03:47.025 07:28:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:47.025 07:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:47.025 07:28:57 -- common/autotest_common.sh@10 -- # set +x
00:03:47.025 ************************************
00:03:47.025 START TEST denied
00:03:47.025 ************************************
00:03:47.025 07:28:57 -- common/autotest_common.sh@1114 -- # denied
00:03:47.025 07:28:57 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0'
00:03:47.025 07:28:57 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0'
00:03:47.025 07:28:57 -- setup/acl.sh@38 -- # setup output config
00:03:47.025 07:28:57 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:47.025 07:28:57 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:50.321 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:03:50.321 07:29:00 -- setup/acl.sh@40 -- # verify 0000:d8:00.0
00:03:50.321 07:29:00 -- setup/acl.sh@28 -- # local dev driver
00:03:50.321 07:29:00 -- setup/acl.sh@30 -- # for dev in "$@"
00:03:50.321 07:29:00 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]]
00:03:50.321 07:29:00 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver
00:03:50.321 07:29:00 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:03:50.321 07:29:00 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:03:50.321 07:29:00 -- setup/acl.sh@41 -- # setup reset
00:03:50.321 07:29:00 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:50.321 07:29:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:55.612
00:03:55.612 real 0m8.083s
00:03:55.612 user 0m2.646s
00:03:55.612 sys 0m4.823s
00:03:55.612 07:29:05 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:55.612 07:29:05 -- common/autotest_common.sh@10 -- # set +x
00:03:55.612 ************************************
00:03:55.612 END TEST denied
00:03:55.612 ************************************
00:03:55.612 07:29:05 -- setup/acl.sh@55 -- # run_test allowed allowed
00:03:55.612 07:29:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:55.612 07:29:05 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:55.612 07:29:05 -- common/autotest_common.sh@10 -- # set +x
00:03:55.612 ************************************
00:03:55.612 START TEST allowed
00:03:55.612 ************************************
00:03:55.612 07:29:05 -- common/autotest_common.sh@1114 -- # allowed
00:03:55.612 07:29:05 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0
00:03:55.612 07:29:05 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*'
00:03:55.612 07:29:05 -- setup/acl.sh@45 -- # setup output config
00:03:55.612 07:29:05 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:55.612 07:29:05 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:00.888 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:00.888 07:29:10 -- setup/acl.sh@47 -- # verify
00:04:00.888 07:29:10 -- setup/acl.sh@28 -- # local dev driver
00:04:00.888 07:29:10 -- setup/acl.sh@48 -- # setup reset
00:04:00.888 07:29:10 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:00.888 07:29:10 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:04.183
00:04:04.183 real 0m8.980s
00:04:04.183 user 0m2.553s
00:04:04.183 sys 0m4.989s
00:04:04.183 07:29:14 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:04.183 07:29:14 -- common/autotest_common.sh@10 -- # set +x
00:04:04.183 ************************************
00:04:04.183 END TEST allowed
00:04:04.183 ************************************
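The denied and allowed tests above drive the same gate from opposite sides: PCI_BLOCKED=' 0000:d8:00.0' must produce the "Skipping denied controller" line, while PCI_ALLOWED=0000:d8:00.0 must let the "nvme -> vfio-pci" rebind through. A hedged sketch of that allow/deny decision as observed in this log; the helper name is mine, not setup.sh's:

    pci_permitted() {
        local bdf=$1
        # Space-framing both sides gives whole-word matching on the space-separated lists.
        if [[ " ${PCI_BLOCKED-} " == *" $bdf "* ]]; then
            echo "Skipping denied controller at $bdf" >&2
            return 1
        fi
        # A non-empty allow-list admits only its members; an empty one admits everything.
        if [[ -n ${PCI_ALLOWED-} && " $PCI_ALLOWED " != *" $bdf "* ]]; then
            return 1
        fi
    }
    # PCI_BLOCKED=' 0000:d8:00.0' pci_permitted 0000:d8:00.0 || echo blocked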
00:04:04.183
00:04:04.183 real 0m24.508s
00:04:04.183 user 0m7.923s
00:04:04.183 sys 0m14.814s
00:04:04.183 07:29:14 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:04.183 07:29:14 -- common/autotest_common.sh@10 -- # set +x
00:04:04.183 ************************************
00:04:04.183 END TEST acl
00:04:04.183 ************************************
00:04:04.183 07:29:14 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:04.183 07:29:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:04.183 07:29:14 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:04.183 07:29:14 -- common/autotest_common.sh@10 -- # set +x
00:04:04.183 ************************************
00:04:04.183 START TEST hugepages
00:04:04.183 ************************************
00:04:04.183 07:29:14 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:04:04.183 * Looking for test storage...
00:04:04.183 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:04.183 07:29:14 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:04.183 07:29:14 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:04.183 07:29:14 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:04.183 07:29:14 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:04.183 07:29:14 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:04.183 07:29:14 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:04.183 07:29:14 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:04.183 07:29:14 -- scripts/common.sh@335 -- # IFS=.-:
00:04:04.183 07:29:14 -- scripts/common.sh@335 -- # read -ra ver1
00:04:04.183 07:29:14 -- scripts/common.sh@336 -- # IFS=.-:
00:04:04.183 07:29:14 -- scripts/common.sh@336 -- # read -ra ver2
00:04:04.183 07:29:14 -- scripts/common.sh@337 -- # local 'op=<'
00:04:04.183 07:29:14 -- scripts/common.sh@339 -- # ver1_l=2
00:04:04.183 07:29:14 -- scripts/common.sh@340 -- # ver2_l=1
00:04:04.183 07:29:14 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:04.183 07:29:14 -- scripts/common.sh@343 -- # case "$op" in
00:04:04.183 07:29:14 -- scripts/common.sh@344 -- # : 1
00:04:04.183 07:29:14 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:04.183 07:29:14 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:04.183 07:29:14 -- scripts/common.sh@364 -- # decimal 1
00:04:04.183 07:29:14 -- scripts/common.sh@352 -- # local d=1
00:04:04.183 07:29:14 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:04.183 07:29:14 -- scripts/common.sh@354 -- # echo 1
00:04:04.183 07:29:14 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:04.183 07:29:14 -- scripts/common.sh@365 -- # decimal 2
00:04:04.183 07:29:14 -- scripts/common.sh@352 -- # local d=2
00:04:04.183 07:29:14 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:04.183 07:29:14 -- scripts/common.sh@354 -- # echo 2
00:04:04.183 07:29:14 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:04.183 07:29:14 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:04.183 07:29:14 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:04.183 07:29:14 -- scripts/common.sh@367 -- # return 0
00:04:04.183 07:29:14 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
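cmp_versions, traced above from scripts/common.sh, splits both version strings on '.', '-', and ':' and walks the fields numerically, which is why "lt 1.15 2" succeeds here and the newer --rc option spelling gets selected. A standalone re-statement of that walk, simplified to a pure less-than with no operator argument (the in-tree helper also sanitizes each field through decimal()):

    version_lt() {
        local -a ver1 ver2
        IFS='.-:' read -ra ver1 <<< "$1"
        IFS='.-:' read -ra ver2 <<< "$2"
        local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            # Missing fields compare as 0, so "1.15" vs "2" works field by field;
            # 10# forces base-10 in case a field has a leading zero.
            (( 10#${ver1[v]:-0} < 10#${ver2[v]:-0} )) && return 0
            (( 10#${ver1[v]:-0} > 10#${ver2[v]:-0} )) && return 1
        done
        return 1   # equal is not less-than
    }
    # version_lt 1.15 2 && echo "pre-2.0 lcov"   -> pre-2.0 lcov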
00:04:04.183 07:29:14 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:04.183 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:04.183 --rc genhtml_branch_coverage=1
00:04:04.183 --rc genhtml_function_coverage=1
00:04:04.183 --rc genhtml_legend=1
00:04:04.183 --rc geninfo_all_blocks=1
00:04:04.183 --rc geninfo_unexecuted_blocks=1
00:04:04.183 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:04.183 '
[... the same multi-line option block is echoed three more times for the LCOV_OPTS= assignment, export 'LCOV=lcov ...', and the LCOV='lcov ...' assignment ...]
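The exported knobs above are typically consumed later by word-splitting them back into lcov's argv; the --gcov-tool wrapper points lcov at the llvm-gcov.sh shim for the fuzzer binaries. A sketch of a capture call under those settings; the function name, test name, and output path are placeholders, not taken from this run:

    capture_coverage() {
        local out=$1
        # $LCOV is "lcov --rc ... --gcov-tool ..." and must stay unquoted so it
        # word-splits back into the command plus its options.
        $LCOV --capture --directory . --test-name autotest $LCOV_OPTS --output-file "$out"
    }
    # capture_coverage cov_total.info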
00:04:04.183 07:29:14 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:04:04.183 07:29:14 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:04:04.183 07:29:14 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:04:04.183 07:29:14 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:04.183 07:29:14 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:04.183 07:29:14 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:04.183 07:29:14 -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:04.183 07:29:14 -- setup/common.sh@18 -- # local node=
00:04:04.183 07:29:14 -- setup/common.sh@19 -- # local var val
00:04:04.183 07:29:14 -- setup/common.sh@20 -- # local mem_f mem
00:04:04.183 07:29:14 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:04.183 07:29:14 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:04.183 07:29:14 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:04.183 07:29:14 -- setup/common.sh@28 -- # mapfile -t mem
00:04:04.183 07:29:14 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:04.183 07:29:14 -- setup/common.sh@31 -- # IFS=': '
00:04:04.183 07:29:14 -- setup/common.sh@31 -- # read -r var val _
00:04:04.184 07:29:14 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39982156 kB' 'MemAvailable: 41611404 kB' 'Buffers: 6784 kB' 'Cached: 10644424 kB' 'SwapCached: 76 kB' 'Active: 8056036 kB' 'Inactive: 3183352 kB' 'Active(anon): 7148708 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591532 kB' 'Mapped: 168640 kB' 'Shmem: 8903236 kB' 'KReclaimable: 578076 kB' 'Slab: 1582820 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1004744 kB' 'KernelStack: 21856 kB' 'PageTables: 8808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 11424104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:04.184 07:29:14 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:04.184 07:29:14 -- setup/common.sh@32 -- # continue
00:04:04.184 07:29:14 -- setup/common.sh@31 -- # IFS=': '
00:04:04.184 07:29:14 -- setup/common.sh@31 -- # read -r var val _
[... the same compare/skip trace repeats for every remaining /proc/meminfo field (MemFree through HugePages_Surp) until the requested key matches ...]
00:04:04.185 07:29:14 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:04.185 07:29:14 -- setup/common.sh@33 -- # echo 2048
00:04:04.185 07:29:14 -- setup/common.sh@33 -- # return 0
00:04:04.185 07:29:14 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:04:04.185 07:29:14 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:04.185 07:29:14 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
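get_meminfo, whose Hugepagesize lookup just returned 2048 above, is a plain scan of /proc/meminfo: slurp the file (or the per-node variant), strip the "Node <n> " prefix that per-node files carry, and split "key: value" pairs until the requested key matches. Distilled into a self-contained helper, renamed so it is not mistaken for the setup/common.sh original:

    shopt -s extglob   # for the +([0-9]) pattern below, as in the traced code
    get_mem_kb() {
        local get=$1 node=${2-} var val _ line
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    # get_mem_kb Hugepagesize   -> 2048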
00:04:04.185 07:29:14 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:04:04.185 07:29:14 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:04:04.185 07:29:14 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:04:04.185 07:29:14 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:04:04.185 07:29:14 -- setup/hugepages.sh@207 -- # get_nodes
00:04:04.185 07:29:14 -- setup/hugepages.sh@27 -- # local node
00:04:04.185 07:29:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.185 07:29:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:04:04.185 07:29:14 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.186 07:29:14 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:04.186 07:29:14 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:04.186 07:29:14 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:04.186 07:29:14 -- setup/hugepages.sh@208 -- # clear_hp
00:04:04.186 07:29:14 -- setup/hugepages.sh@37 -- # local node hp
00:04:04.186 07:29:14 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:04.186 07:29:14 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:04.186 07:29:14 -- setup/hugepages.sh@41 -- # echo 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:04.186 07:29:14 -- setup/hugepages.sh@41 -- # echo 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:04.186 07:29:14 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:04.186 07:29:14 -- setup/hugepages.sh@41 -- # echo 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:04.186 07:29:14 -- setup/hugepages.sh@41 -- # echo 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:04.186 07:29:14 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:04.186 07:29:14 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:04:04.186 07:29:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:04.186 07:29:14 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:04.186 07:29:14 -- common/autotest_common.sh@10 -- # set +x
00:04:04.186 ************************************
00:04:04.186 START TEST default_setup
00:04:04.186 ************************************
00:04:04.186 07:29:14 -- common/autotest_common.sh@1114 -- # default_setup
00:04:04.186 07:29:14 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:04.186 07:29:14 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:04.186 07:29:14 -- setup/hugepages.sh@51 -- # shift
00:04:04.186 07:29:14 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:04.186 07:29:14 -- setup/hugepages.sh@52 -- # local node_ids
00:04:04.186 07:29:14 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:04.186 07:29:14 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:04.186 07:29:14 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:04.186 07:29:14 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:04.186 07:29:14 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:04.186 07:29:14 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:04.186 07:29:14 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:04.186 07:29:14 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:04.186 07:29:14 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:04.186 07:29:14 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:04.186 07:29:14 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:04.186 07:29:14 -- setup/hugepages.sh@73 -- # return 0
00:04:04.186 07:29:14 -- setup/hugepages.sh@137 -- # setup output
00:04:04.186 07:29:14 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:04.186 07:29:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
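default_setup, started above, first zeroes every per-node hugepage pool (clear_hp's nested loops echoing 0) and then reserves nr_hugepages=1024 on node 0 alone; the driver rebinds below are setup.sh running under that configuration. The same clear-then-reserve pattern in one root-only function, with an illustrative name and argument order:

    clear_then_reserve() {
        local want_node=$1 count=$2 node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*/nr_hugepages; do
                echo 0 > "$hp"   # drop 2048kB and 1048576kB pools alike
            done
        done
        echo "$count" > "/sys/devices/system/node/node${want_node}/hugepages/hugepages-2048kB/nr_hugepages"
    }
    # clear_then_reserve 0 1024   # mirrors nodes_test[0]=1024 from the trace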
00:04:07.476 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:07.476 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:08.861 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:04:08.861 07:29:19 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:04:08.861 07:29:19 -- setup/hugepages.sh@89 -- # local node
00:04:08.862 07:29:19 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:08.862 07:29:19 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:08.862 07:29:19 -- setup/hugepages.sh@92 -- # local surp
00:04:08.862 07:29:19 -- setup/hugepages.sh@93 -- # local resv
00:04:08.862 07:29:19 -- setup/hugepages.sh@94 -- # local anon
00:04:08.862 07:29:19 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:08.862 07:29:19 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:08.862 07:29:19 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:08.862 07:29:19 -- setup/common.sh@18 -- # local node=
00:04:08.862 07:29:19 -- setup/common.sh@19 -- # local var val
00:04:08.862 07:29:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.862 07:29:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.862 07:29:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.862 07:29:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.862 07:29:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.862 07:29:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': '
00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _
00:04:08.862 07:29:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42213140 kB' 'MemAvailable: 43842388 kB' 'Buffers: 6784 kB' 'Cached: 10644556 kB' 'SwapCached: 76 kB' 'Active: 8057872 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150544 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593172 kB' 'Mapped: 168792 kB' 'Shmem: 8903368 kB' 'KReclaimable: 578076 kB' 'Slab: 1581224 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1003148 kB' 'KernelStack: 22000 kB' 'PageTables: 9032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11425420 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 
-- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.862 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.862 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:08.863 07:29:19 -- setup/common.sh@33 -- # echo 0 00:04:08.863 07:29:19 -- setup/common.sh@33 -- # return 0 00:04:08.863 07:29:19 -- setup/hugepages.sh@97 -- # anon=0 00:04:08.863 07:29:19 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:08.863 07:29:19 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:08.863 07:29:19 -- setup/common.sh@18 -- # local node= 00:04:08.863 07:29:19 -- setup/common.sh@19 -- # local var val 00:04:08.863 07:29:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:08.863 07:29:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.863 07:29:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:08.863 07:29:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.863 07:29:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.863 07:29:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42213636 kB' 'MemAvailable: 43842884 kB' 'Buffers: 6784 kB' 'Cached: 10644560 kB' 'SwapCached: 76 kB' 'Active: 8058172 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150844 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593452 kB' 'Mapped: 168764 kB' 'Shmem: 8903372 kB' 'KReclaimable: 578076 kB' 'Slab: 1581188 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1003112 kB' 'KernelStack: 21856 kB' 'PageTables: 8784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11426948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 
00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # [[ SwapFree 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.863 07:29:19 -- setup/common.sh@32 -- # continue
00:04:08.863 07:29:19 -- setup/common.sh@31 -- # IFS=': '
00:04:08.863 07:29:19 -- setup/common.sh@31 -- # read -r var val _
[get_meminfo scan condensed: every remaining /proc/meminfo key (Zswap, Zswapped, Dirty, Writeback, AnonPages, ... HugePages_Total, HugePages_Free, HugePages_Rsvd) is compared against HugePages_Surp and skipped via continue until the matching line is reached]
00:04:08.864 07:29:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:08.864 07:29:19 -- setup/common.sh@33 -- # echo 0
00:04:08.864 07:29:19 -- setup/common.sh@33 -- # return 0
00:04:08.864 07:29:19 -- setup/hugepages.sh@99 -- # surp=0
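A note on the '\H\u\g\e\P\a\g\e\s\_\S\u\r\p' spelling in the trace above: bash xtrace prints the quoted right-hand side of a '[[ ... == ... ]]' comparison with every character backslash-escaped, marking the operand as a literal string rather than a glob pattern. A hypothetical standalone snippet (not part of setup/common.sh) that reproduces the effect:

    #!/usr/bin/env bash
    # Reproduce the escaped pattern rendering seen in the trace above.
    set -x
    get=HugePages_Surp
    var=MemTotal
    if [[ $var == "$get" ]]; then   # quoted RHS => literal match, not a glob
        echo "found $get"
    fi
    # xtrace renders the test as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]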
00:04:08.864 07:29:19 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:08.864 07:29:19 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:08.864 07:29:19 -- setup/common.sh@18 -- # local node=
00:04:08.864 07:29:19 -- setup/common.sh@19 -- # local var val
00:04:08.864 07:29:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.864 07:29:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.864 07:29:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.864 07:29:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.864 07:29:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.864 07:29:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.864 07:29:19 -- setup/common.sh@31 -- # IFS=': '
00:04:08.864 07:29:19 -- setup/common.sh@31 -- # read -r var val _
00:04:08.864 07:29:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42214180 kB' 'MemAvailable: 43843428 kB' 'Buffers: 6784 kB' 'Cached: 10644572 kB' 'SwapCached: 76 kB' 'Active: 8059504 kB' 'Inactive: 3183352 kB' 'Active(anon): 7152176 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594800 kB' 'Mapped: 168704 kB' 'Shmem: 8903384 kB' 'KReclaimable: 578076 kB' 'Slab: 1581156 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1003080 kB' 'KernelStack: 22176 kB' 'PageTables: 9536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11426964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
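The trace just above shows the whole get_meminfo pattern: pick /proc/meminfo (or a per-node meminfo file when a node argument is given), mapfile it into an array, strip the "Node N " prefix that per-node files carry, then scan line by line with IFS=': ' until the requested key matches and echo its value. A minimal sketch of that pattern, reconstructed from the trace; names mirror the trace, but the body is a simplification, not the verbatim setup/common.sh helper:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern in the prefix strip
    # Sketch of the get_meminfo pattern traced above (a reconstruction, not
    # the verbatim SPDK helper).
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Rsvd   # prints 0 on the machine traced above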
[get_meminfo scan condensed: each key of the snapshot above is compared against HugePages_Rsvd and skipped via continue until the matching line]
00:04:08.866 07:29:19 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:08.866 07:29:19 -- setup/common.sh@33 -- # echo 0
00:04:08.866 07:29:19 -- setup/common.sh@33 -- # return 0
00:04:08.866 07:29:19 -- setup/hugepages.sh@100 -- # resv=0
00:04:08.866 07:29:19 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:08.866 nr_hugepages=1024
00:04:08.866 07:29:19 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:08.866 resv_hugepages=0
00:04:08.866 07:29:19 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:08.866 surplus_hugepages=0
00:04:08.866 07:29:19 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:08.866 anon_hugepages=0
00:04:08.866 07:29:19 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:08.866 07:29:19 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:08.866 07:29:19 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:08.866 07:29:19 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:08.866 07:29:19 -- setup/common.sh@18 -- # local node=
00:04:08.866 07:29:19 -- setup/common.sh@19 -- # local var val
00:04:08.866 07:29:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.866 07:29:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.866 07:29:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.866 07:29:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.866 07:29:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.866 07:29:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
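The two arithmetic guards at hugepages.sh@107 and @109 are the heart of the verification: the kernel's reported total must equal the requested page count plus surplus and reserved pages. Spelled out with the values echoed above (a paraphrase of the asserted conditions, reusing the get_meminfo sketch from earlier):

    # Values echoed in the trace above.
    nr_hugepages=1024
    surp=0                                # HugePages_Surp
    resv=0                                # HugePages_Rsvd
    total=$(get_meminfo HugePages_Total)  # 1024 in this run
    # 1024 == 1024 + 0 + 0: the kernel accounts for every requested page.
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2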
00:04:08.866 07:29:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42232320 kB' 'MemAvailable: 43861568 kB' 'Buffers: 6784 kB' 'Cached: 10644572 kB' 'SwapCached: 76 kB' 'Active: 8058656 kB' 'Inactive: 3183352 kB' 'Active(anon): 7151328 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593972 kB' 'Mapped: 168688 kB' 'Shmem: 8903384 kB' 'KReclaimable: 578076 kB' 'Slab: 1580996 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1002920 kB' 'KernelStack: 22192 kB' 'PageTables: 9352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11425460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:08.866 07:29:19 -- setup/common.sh@31 -- # IFS=': '
00:04:08.866 07:29:19 -- setup/common.sh@31 -- # read -r var val _
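The snapshot is internally consistent, which is worth checking by hand: 1024 pages of 2048 kB each is exactly the Hugetlb figure printed above.

    # Cross-check of the snapshot above: hugetlb memory = page count * page size.
    pages=1024       # HugePages_Total
    page_kb=2048     # Hugepagesize
    echo $(( pages * page_kb ))   # 2097152, matching 'Hugetlb: 2097152 kB'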
[get_meminfo scan condensed: each key of the snapshot above is compared against HugePages_Total and skipped via continue until the matching line]
00:04:08.867 07:29:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:08.867 07:29:19 -- setup/common.sh@33 -- # echo 1024
00:04:08.867 07:29:19 -- setup/common.sh@33 -- # return 0
00:04:08.867 07:29:19 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:08.867 07:29:19 -- setup/hugepages.sh@112 -- # get_nodes
00:04:08.867 07:29:19 -- setup/hugepages.sh@27 -- # local node
00:04:08.867 07:29:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.867 07:29:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:08.867 07:29:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.867 07:29:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:08.867 07:29:19 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:08.867 07:29:19 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:08.867 07:29:19 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:08.867 07:29:19 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:08.867 07:29:19 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:08.867 07:29:19 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:08.867 07:29:19 -- setup/common.sh@18 -- # local node=0
00:04:08.867 07:29:19 -- setup/common.sh@19 -- # local var val
00:04:08.867 07:29:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.867 07:29:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.867 07:29:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:08.867 07:29:19 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:08.867 07:29:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.867 07:29:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.867 07:29:19 -- setup/common.sh@31 -- # IFS=': '
00:04:08.867 07:29:19 -- setup/common.sh@31 -- # read -r var val _
00:04:08.867 07:29:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22621092 kB' 'MemUsed: 10013344 kB' 'SwapCached: 44 kB' 'Active: 4986648 kB' 'Inactive: 535260 kB' 'Active(anon): 4209088 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5312844 kB' 'Mapped: 103364 kB' 'AnonPages: 212180 kB' 'Shmem: 4000036 kB' 'KernelStack: 10744 kB' 'PageTables: 4036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397624 kB' 'Slab: 881368 kB' 'SReclaimable: 397624 kB' 'SUnreclaim: 483744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
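get_nodes above walks /sys/devices/system/node/node+([0-9]) and records one hugepage count per NUMA node (1024 on node0, 0 on node1 here); get_meminfo is then re-run with node=0, which flips mem_f to the per-node meminfo file whose snapshot is printed above. A sketch of the enumeration; the nr_hugepages sysfs path is my assumption about where the per-node counts come from, since the trace only shows the resulting assignments:

    shopt -s extglob
    # Sketch of the get_nodes pattern traced above. Assumption: per-node
    # counts come from the nodeN/hugepages/.../nr_hugepages sysfs files.
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} strips the path prefix, leaving the node index.
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}     # 2 on this machine
    (( no_nodes > 0 )) || exit 1
    get_meminfo HugePages_Surp 0  # per-node lookup: reads node0/meminfo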
[get_meminfo scan condensed: each key of the node0 snapshot above is compared against HugePages_Surp and skipped via continue until the matching line]
00:04:08.868 07:29:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:08.868 07:29:19 -- setup/common.sh@33 -- # echo 0
00:04:08.868 07:29:19 -- setup/common.sh@33 -- # return 0
00:04:08.868 07:29:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:08.868 07:29:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:08.868 07:29:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:08.868 07:29:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
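The sorted_t/sorted_s assignments at hugepages.sh@127 rely on a bash idiom: writing 1 into an array slot indexed by the observed count collects the distinct values, and because "${!arr[@]}" returns the indices of an indexed array in ascending order, the keys come back sorted, hence the names. A small illustration with this run's numbers:

    # The dedup-and-sort idiom behind sorted_t/sorted_s: array subscripts act
    # as a set, and "${!arr[@]}" yields the keys in ascending numeric order.
    nodes_test=([0]=1024 [1]=0)   # expectations accumulated above
    nodes_sys=([0]=1024 [1]=0)    # counts reported per node
    declare -a sorted_t sorted_s
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1
        sorted_s[nodes_sys[node]]=1
    done
    echo "${!sorted_t[@]}"   # 0 1024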
00:04:08.868 07:29:19 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:08.868 node0=1024 expecting 1024
00:04:08.868 07:29:19 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:08.868 
00:04:08.868 real	0m4.685s
00:04:08.868 user	0m1.033s
00:04:08.868 sys	0m2.008s
00:04:08.868 07:29:19 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:08.868 07:29:19 -- common/autotest_common.sh@10 -- # set +x
00:04:08.868 ************************************
00:04:08.868 END TEST default_setup
00:04:08.868 ************************************
00:04:08.868 07:29:19 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:08.868 07:29:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:08.868 07:29:19 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:08.868 07:29:19 -- common/autotest_common.sh@10 -- # set +x
00:04:08.868 ************************************
00:04:08.868 START TEST per_node_1G_alloc
00:04:08.868 ************************************
00:04:08.869 07:29:19 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:04:08.869 07:29:19 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:08.869 07:29:19 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:08.869 07:29:19 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:08.869 07:29:19 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:08.869 07:29:19 -- setup/hugepages.sh@51 -- # shift
00:04:08.869 07:29:19 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:08.869 07:29:19 -- setup/hugepages.sh@52 -- # local node_ids
00:04:08.869 07:29:19 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:08.869 07:29:19 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:08.869 07:29:19 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:08.869 07:29:19 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:08.869 07:29:19 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:08.869 07:29:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:08.869 07:29:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:08.869 07:29:19 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:08.869 07:29:19 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:08.869 07:29:19 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:08.869 07:29:19 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:08.869 07:29:19 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:08.869 07:29:19 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:08.869 07:29:19 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:08.869 07:29:19 -- setup/hugepages.sh@73 -- # return 0
00:04:08.869 07:29:19 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:08.869 07:29:19 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:08.869 07:29:19 -- setup/hugepages.sh@146 -- # setup output
00:04:08.869 07:29:19 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:08.869 07:29:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
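per_node_1G_alloc asks get_test_nr_hugepages for 1048576 kB (1 GiB) on each of nodes 0 and 1; with the default 2048 kB hugepage size that is 512 pages per node, which is exactly the NRHUGE=512 HUGENODE=0,1 environment handed to setup.sh above. The division itself is implied by the size/default_hugepages comparison in the trace rather than shown, so treat this as a reading of it:

    # Arithmetic behind nr_hugepages=512 in the trace above.
    size_kb=1048576      # 1 GiB requested per node
    hugepage_kb=2048     # default 2 MiB hugepages (Hugepagesize)
    echo $(( size_kb / hugepage_kb ))   # 512 -> NRHUGE=512, HUGENODE=0,1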
00:04:12.162 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:12.162 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:12.424 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:12.424 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:12.425 07:29:23 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:12.425 07:29:23 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:12.425 07:29:23 -- setup/hugepages.sh@89 -- # local node
00:04:12.425 07:29:23 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:12.425 07:29:23 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:12.425 07:29:23 -- setup/hugepages.sh@92 -- # local surp
00:04:12.425 07:29:23 -- setup/hugepages.sh@93 -- # local resv
00:04:12.425 07:29:23 -- setup/hugepages.sh@94 -- # local anon
00:04:12.425 07:29:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:12.425 07:29:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
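The test at hugepages.sh@96 reads the transparent-hugepage policy ("always [madvise] never" on this box, with the active mode bracketed) and only samples AnonHugePages when THP is not pinned to [never]. A sketch of that gate, reconstructed from the traced comparison; the sysfs read is the standard location for that string:

    # Sketch of the anon-hugepage gate traced above: skip the AnonHugePages
    # sample when transparent hugepages are disabled outright.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)  # e.g. always [madvise] never
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in the snapshot below
    fi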
00:04:12.425 07:29:23 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:12.425 07:29:23 -- setup/common.sh@18 -- # local node=
00:04:12.425 07:29:23 -- setup/common.sh@19 -- # local var val
00:04:12.425 07:29:23 -- setup/common.sh@20 -- # local mem_f mem
00:04:12.425 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:12.425 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:12.425 07:29:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:12.425 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem
00:04:12.425 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:12.425 07:29:23 -- setup/common.sh@31 -- # IFS=': '
00:04:12.425 07:29:23 -- setup/common.sh@31 -- # read -r var val _
00:04:12.425 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42255040 kB' 'MemAvailable: 43884288 kB' 'Buffers: 6784 kB' 'Cached: 10644668 kB' 'SwapCached: 76 kB' 'Active: 8054768 kB' 'Inactive: 3183352 kB' 'Active(anon): 7147440 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589896 kB' 'Mapped: 167536 kB' 'Shmem: 8903480 kB' 'KReclaimable: 578076 kB' 'Slab: 1581244 kB' 'SReclaimable: 578076 kB' 'SUnreclaim: 1003168 kB' 'KernelStack: 21840 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11415684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[get_meminfo scan condensed: each key of the snapshot above is compared against AnonHugePages and skipped via continue; the capture ends partway through this scan]
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.455 07:29:23 -- setup/common.sh@33 -- # echo 0 00:04:12.455 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.455 07:29:23 -- setup/hugepages.sh@97 -- # anon=0 00:04:12.455 07:29:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:12.455 07:29:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.455 07:29:23 -- setup/common.sh@18 -- # local node= 00:04:12.455 07:29:23 -- setup/common.sh@19 -- # local var val 00:04:12.455 07:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.455 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.455 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.455 07:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.455 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.455 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.455 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42255140 kB' 'MemAvailable: 43884324 kB' 'Buffers: 6784 kB' 'Cached: 10644672 kB' 'SwapCached: 76 kB' 'Active: 8054184 kB' 'Inactive: 3183352 kB' 'Active(anon): 7146856 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589300 kB' 'Mapped: 167536 kB' 'Shmem: 8903484 kB' 'KReclaimable: 578012 kB' 'Slab: 1581196 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003184 kB' 'KernelStack: 21840 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11415696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.455 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.455 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 
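[editor's note] The trace here is the get_meminfo helper walking the /proc/meminfo snapshot it just printed, one test/continue pair per key, until the requested key matches and its value is echoed; AnonHugePages came back 0 kB, so transparent huge pages are not inflating the figures. A minimal bash sketch of that helper, reconstructed from the @17-@33 trace steps above -- an approximation under stated assumptions, not a verbatim copy of SPDK's setup/common.sh:

    #!/usr/bin/env bash
    # Reconstruction of the traced get_meminfo helper (assumed behavior).
    shopt -s extglob   # needed for the "Node N " prefix strip below

    get_meminfo() {
        local get=$1        # key to look up, e.g. AnonHugePages
        local node=${2:-}   # optional NUMA node number
        local var val _ mem line
        local mem_f=/proc/meminfo
        # A per-node query reads the sysfs copy instead of the global file.
        if [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # the long test/continue run in the log
            echo "$val"   # kB figure, or a bare page count for HugePages_* keys
            return 0
        done
        return 1
    }

    get_meminfo AnonHugePages      # -> 0 on this host
    get_meminfo HugePages_Surp 0   # node-local variant, traced further below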
00:04:12.455 07:29:23 -- [xtrace condensed: get_meminfo scans the snapshot above for HugePages_Surp, repeating the test/continue/IFS/read cycle for every key from MemTotal through HugePages_Rsvd] 00:04:12.458 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.458 07:29:23 -- setup/common.sh@33 -- # echo 0 00:04:12.458 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.458 07:29:23 -- setup/hugepages.sh@99 -- # surp=0 00:04:12.458 07:29:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:12.458 07:29:23 -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:04:12.457 07:29:23 -- setup/common.sh@18 -- # local node= 00:04:12.457 07:29:23 -- setup/common.sh@19 -- # local var val 00:04:12.457 07:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.457 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.457 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.457 07:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.457 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.457 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42255332 kB' 'MemAvailable: 43884516 kB' 'Buffers: 6784 kB' 'Cached: 10644684 kB' 'SwapCached: 76 kB' 'Active: 8054056 kB' 'Inactive: 3183352 kB' 'Active(anon): 7146728 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589128 kB' 'Mapped: 167536 kB' 'Shmem: 8903496 kB' 'KReclaimable: 578012 kB' 'Slab: 1581196 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003184 kB' 'KernelStack: 21824 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11415708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.457 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.457 07:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.457 
07:29:23 -- setup/common.sh@32 -- # continue [xtrace condensed: the HugePages_Rsvd scan repeats the same cycle for every key from Active through HugePages_Free] 00:04:12.458 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:12.458 07:29:23 -- setup/common.sh@33 -- # echo 0 00:04:12.458 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.458 07:29:23 -- setup/hugepages.sh@100 -- # resv=0 00:04:12.458 07:29:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:12.458 nr_hugepages=1024 07:29:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:12.458 resv_hugepages=0 07:29:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:12.458 surplus_hugepages=0 07:29:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:12.458 anon_hugepages=0 07:29:23 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:12.458 07:29:23 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:12.458 07:29:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:12.458 07:29:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:12.458 07:29:23 -- setup/common.sh@18 -- # local node= 00:04:12.458 07:29:23 -- setup/common.sh@19 -- # local var val 00:04:12.458 07:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.458 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.458 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.458 07:29:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.458 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.458 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.458 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.458 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.459 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42255972 kB' 'MemAvailable: 43885156 kB' 'Buffers: 6784 kB' 'Cached: 10644700 kB' 'SwapCached: 76 kB' 'Active: 8054224 kB' 'Inactive: 3183352 kB' 'Active(anon): 7146896 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589308 kB' 'Mapped: 167536 kB' 'Shmem: 8903512 kB' 'KReclaimable: 578012 kB' 'Slab: 1581196 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003184 kB' 'KernelStack: 21840 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11415724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
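[editor's note] The figures echoed just above (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0) are what this whole walk was for: hugepages.sh@107 asserts that the requested page count is fully covered by the static pool, with no surplus or reserved pages outstanding. A small sketch of that accounting check, assuming the get_meminfo reconstruction above; the literal 1024 is the pool size this job configured:

    # Hypothetical standalone version of the hugepages.sh@97-@110 checks.
    requested=1024                               # pages this job asked for
    anon=$(get_meminfo AnonHugePages)            # 0 kB: checked separately, THP must not interfere
    surp=$(get_meminfo HugePages_Surp)           # 0
    resv=$(get_meminfo HugePages_Rsvd)           # 0
    nr_hugepages=$(get_meminfo HugePages_Total)  # 1024
    # Every requested page must come from the static pool alone:
    (( requested == nr_hugepages + surp + resv )) && echo "hugepage pool consistent"

As a cross-check, the snapshot itself agrees: 1024 pages x 2048 kB Hugepagesize = 2097152 kB, exactly the Hugetlb figure printed above.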
00:04:12.459 07:29:23 -- [xtrace condensed: the HugePages_Total scan repeats the same cycle for every key from MemTotal through Unaccepted] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:12.460 07:29:23 -- setup/common.sh@33 -- # echo 1024 00:04:12.460 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.460 07:29:23 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:12.460 07:29:23 -- setup/hugepages.sh@112 -- # get_nodes 00:04:12.460 07:29:23 -- setup/hugepages.sh@27 -- # local node 00:04:12.460 07:29:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:12.460 07:29:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:12.460 07:29:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:12.460 07:29:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:12.460 07:29:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:12.460 07:29:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:12.460 07:29:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:12.460 07:29:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:12.460 07:29:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:12.460 07:29:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.460 07:29:23 -- setup/common.sh@18 -- # local node=0 00:04:12.460 07:29:23 -- setup/common.sh@19 -- # local var val 00:04:12.460 07:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.460 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.460 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:12.460 07:29:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:12.460 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.460 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.460 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23678556 kB' 'MemUsed: 8955880 kB' 'SwapCached: 44 kB' 'Active: 4985340 kB' 'Inactive: 535260 kB' 'Active(anon): 4207780 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5312960 kB' 'Mapped: 102552 kB' 'AnonPages: 210780 kB' 'Shmem: 4000152 kB' 'KernelStack: 10760 kB' 'PageTables: 4024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397560 kB' 'Slab: 881396 kB' 'SReclaimable: 397560 kB' 'SUnreclaim: 483836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
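[editor's note] After the global totals check out, get_nodes splits the expectation across NUMA nodes: two nodes, 512 pages each, and the loop at hugepages.sh@115-@117 re-runs get_meminfo against each node's sysfs meminfo, starting with node 0 whose snapshot appears above. A sketch of that per-node pass, assuming the get_meminfo helper sketched earlier; the 512 literal mirrors the traced value on this 2-node box and would normally come from a per-node sysfs read:

    # Hypothetical per-node verification, following the traced get_nodes shape.
    shopt -s extglob
    declare -A nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=512   # 1024 pages spread over 2 NUMA nodes
    done
    no_nodes=${#nodes_sys[@]}           # 2 on this machine
    (( no_nodes > 0 )) || exit 1
    # Each node must report its share of the pool: 512 total, 0 surplus.
    for node in "${!nodes_sys[@]}"; do
        surp=$(get_meminfo HugePages_Surp "$node")
        total=$(get_meminfo HugePages_Total "$node")
        echo "node$node: total=$total surp=$surp (expected ${nodes_sys[$node]})"
    done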
setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.460 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.460 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.461 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.461 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@33 -- # echo 0 00:04:12.721 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.721 07:29:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:12.721 07:29:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:12.721 
07:29:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:12.721 07:29:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:12.721 07:29:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:12.721 07:29:23 -- setup/common.sh@18 -- # local node=1 00:04:12.721 07:29:23 -- setup/common.sh@19 -- # local var val 00:04:12.721 07:29:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.721 07:29:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.721 07:29:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:12.721 07:29:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:12.721 07:29:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.721 07:29:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18578088 kB' 'MemUsed: 9071272 kB' 'SwapCached: 32 kB' 'Active: 3068548 kB' 'Inactive: 2648092 kB' 'Active(anon): 2938780 kB' 'Inactive(anon): 2342652 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5338624 kB' 'Mapped: 64984 kB' 'AnonPages: 378140 kB' 'Shmem: 4903384 kB' 'KernelStack: 11064 kB' 'PageTables: 4456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 180452 kB' 'Slab: 699800 kB' 'SReclaimable: 180452 kB' 'SUnreclaim: 519348 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ 
Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.721 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.721 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- 
setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # continue 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.722 07:29:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.722 07:29:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:12.722 07:29:23 -- setup/common.sh@33 -- # echo 0 00:04:12.722 07:29:23 -- setup/common.sh@33 -- # return 0 00:04:12.722 07:29:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:12.722 07:29:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:12.722 07:29:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:12.722 07:29:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:12.722 07:29:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:12.722 node0=512 expecting 512 00:04:12.722 07:29:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:12.722 07:29:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:12.722 07:29:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:12.722 07:29:23 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:12.722 node1=512 expecting 512 00:04:12.722 07:29:23 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:12.722 00:04:12.722 real 0m3.638s 00:04:12.722 user 0m1.394s 00:04:12.722 sys 0m2.304s 00:04:12.722 07:29:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:12.722 07:29:23 -- common/autotest_common.sh@10 -- # set +x 00:04:12.722 ************************************ 00:04:12.722 END TEST per_node_1G_alloc 00:04:12.722 ************************************ 00:04:12.722 07:29:23 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:12.722 07:29:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:12.722 07:29:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:12.722 07:29:23 -- common/autotest_common.sh@10 -- # set +x 00:04:12.722 ************************************ 00:04:12.722 START TEST even_2G_alloc 00:04:12.722 ************************************ 00:04:12.722 07:29:23 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:04:12.722 07:29:23 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:12.722 07:29:23 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:12.722 07:29:23 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:12.722 07:29:23 -- 
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:12.722 07:29:23 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:12.722 07:29:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:12.722 07:29:23 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:12.722 07:29:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:12.723 07:29:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:12.723 07:29:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:12.723 07:29:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:12.723 07:29:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:12.723 07:29:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:12.723 07:29:23 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:12.723 07:29:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.723 07:29:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:12.723 07:29:23 -- setup/hugepages.sh@83 -- # : 512 00:04:12.723 07:29:23 -- setup/hugepages.sh@84 -- # : 1 00:04:12.723 07:29:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.723 07:29:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:12.723 07:29:23 -- setup/hugepages.sh@83 -- # : 0 00:04:12.723 07:29:23 -- setup/hugepages.sh@84 -- # : 0 00:04:12.723 07:29:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:12.723 07:29:23 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:12.723 07:29:23 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:12.723 07:29:23 -- setup/hugepages.sh@153 -- # setup output 00:04:12.723 07:29:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.723 07:29:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:16.017 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:16.017 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:16.017 07:29:26 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:16.017 07:29:26 -- setup/hugepages.sh@89 -- # local node 00:04:16.017 07:29:26 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:16.017 07:29:26 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:16.017 07:29:26 -- setup/hugepages.sh@92 -- # local surp 00:04:16.017 07:29:26 -- setup/hugepages.sh@93 -- # local resv 00:04:16.017 07:29:26 -- setup/hugepages.sh@94 -- # local anon 00:04:16.017 07:29:26 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != 
*\[\n\e\v\e\r\]* ]] 00:04:16.017 07:29:26 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:16.017 07:29:26 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:16.017 07:29:26 -- setup/common.sh@18 -- # local node= 00:04:16.017 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.017 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.017 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.017 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.017 07:29:26 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.017 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.017 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.017 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.017 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42315172 kB' 'MemAvailable: 43944356 kB' 'Buffers: 6784 kB' 'Cached: 10644804 kB' 'SwapCached: 76 kB' 'Active: 8055176 kB' 'Inactive: 3183352 kB' 'Active(anon): 7147848 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590104 kB' 'Mapped: 167624 kB' 'Shmem: 8903616 kB' 'KReclaimable: 578012 kB' 'Slab: 1581380 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003368 kB' 'KernelStack: 21840 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11416332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 
07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.018 07:29:26 -- setup/common.sh@33 -- # echo 0 00:04:16.018 07:29:26 -- setup/common.sh@33 -- # return 0 00:04:16.018 07:29:26 -- setup/hugepages.sh@97 -- # anon=0 00:04:16.018 07:29:26 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:16.018 07:29:26 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:16.018 07:29:26 -- setup/common.sh@18 -- # local node= 00:04:16.018 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.018 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.018 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.018 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.018 07:29:26 -- setup/common.sh@25 -- # [[ -n '' ]] 
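The key-by-key scans filling this log are all the same helper: setup/common.sh's get_meminfo walks every line of a meminfo file and echoes the value whose key matches the one requested. A minimal sketch of that helper, reconstructed from the traced commands; the shopt line, the if wrapper, and the not-found return are assumptions (the trace only records the successful path, and the [[ -n ... ]] branch visible at common.sh@25 is omitted here):

shopt -s extglob   # needed for the "Node +([0-9]) " prefix strip below

get_meminfo() {
    local get=$1 node=$2
    local var val _
    local mem_f=/proc/meminfo
    local -a mem
    # A node argument switches the source to that NUMA node's meminfo,
    # which is why the trace tests /sys/devices/system/node/node$node/meminfo.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node meminfo prefixes every line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the long key-by-key scan in the log
        echo "$val"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1   # assumed: requested key absent
}

It is called as get_meminfo HugePages_Total for the global pool or as get_meminfo HugePages_Surp 0 for node 0, which is exactly the pair of call shapes traced above; every rejected key costs one [[ ... ]] test plus a continue, so a single lookup expands into dozens of log entries.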
00:04:16.018 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.018 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42315224 kB' 'MemAvailable: 43944408 kB' 'Buffers: 6784 kB' 'Cached: 10644804 kB' 'SwapCached: 76 kB' 'Active: 8054904 kB' 'Inactive: 3183352 kB' 'Active(anon): 7147576 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589856 kB' 'Mapped: 167544 kB' 'Shmem: 8903616 kB' 'KReclaimable: 578012 kB' 'Slab: 1581360 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003348 kB' 'KernelStack: 21840 kB' 'PageTables: 8532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11416344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.018 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.018 07:29:26 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 
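This stretch is verify_nr_hugepages collecting AnonHugePages, HugePages_Surp and HugePages_Rsvd (all 0 on this box) before checking the pool arithmetic: HugePages_Total must equal the requested pages plus surplus plus reserved, globally and then per NUMA node (1024 total, 512 per node for even_2G_alloc). A condensed sketch of that check, reconstructed from the traced hugepages.sh lines; nodes_test and nodes_sys are assumed to have been filled earlier by get_test_nr_hugepages_per_node and get_nodes (both visible further up the trace), and the failure branch is an assumption since only the passing path is logged:

verify_nr_hugepages() {
    local anon surp resv node
    # THP is not set to [never] on this box (the hugepages.sh@96 test above),
    # so anonymous hugepages are sampled too; consumed by later checks
    # not reproduced in this sketch.
    anon=$(get_meminfo AnonHugePages)
    surp=$(get_meminfo HugePages_Surp)   # pages allocated beyond the configured pool
    resv=$(get_meminfo HugePages_Rsvd)   # reserved but not yet faulted in
    # Global pool: every requested page must be accounted for.
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )) || return 1
    # Per node: surplus and reserved shift the expected count for that node.
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
    done
}

With surplus and reserved both 0 here, the per-node expectation stays at the 512 pages assigned per node, which is what the "node0=512 expecting 512" / "node1=512 expecting 512" lines in the trace report.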
00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.019 07:29:26 -- setup/common.sh@33 -- # echo 0 00:04:16.019 07:29:26 -- setup/common.sh@33 -- # return 0 00:04:16.019 07:29:26 -- setup/hugepages.sh@99 -- # surp=0 00:04:16.019 07:29:26 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:16.019 07:29:26 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:16.019 07:29:26 -- setup/common.sh@18 -- # local node= 00:04:16.019 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.019 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.019 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.019 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.019 07:29:26 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.019 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.019 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.019 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.019 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42315728 kB' 'MemAvailable: 43944912 kB' 'Buffers: 6784 kB' 'Cached: 10644804 kB' 'SwapCached: 76 kB' 'Active: 8054940 kB' 'Inactive: 3183352 kB' 'Active(anon): 7147612 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589888 kB' 'Mapped: 167544 kB' 'Shmem: 8903616 kB' 'KReclaimable: 578012 kB' 'Slab: 1581360 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003348 kB' 'KernelStack: 
21856 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11416356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:16.019 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 
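Each of these scans supplies one term of the accounting identity checked a few entries below: the surplus captured at hugepages.sh@99 (surp=0) and the reserved count being gathered here (resv, hugepages.sh@100) are added to the requested page count and compared against HugePages_Total. A condensed form of that check, reusing the sketch above; the standalone wiring is an assumption, though the variable names mirror the trace:

    # Hedged reconstruction of the consistency check at hugepages.sh@99-@110.
    nr_hugepages=1024                             # the count this test requested
    surp=$(get_meminfo_sketch HugePages_Surp)     # 0 in this run
    resv=$(get_meminfo_sketch HugePages_Rsvd)     # 0 in this run
    total=$(get_meminfo_sketch HugePages_Total)   # 1024 in this run
    (( total == nr_hugepages + surp + resv )) && echo accounting-ok

With 1024 == 1024 + 0 + 0 the arithmetic holds, which is the (( 1024 == nr_hugepages + surp + resv )) entry visible further down.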
00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.020 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.020 07:29:26 -- setup/common.sh@33 -- # echo 0 00:04:16.020 07:29:26 -- setup/common.sh@33 -- # return 0 00:04:16.020 07:29:26 -- setup/hugepages.sh@100 -- # resv=0 00:04:16.020 07:29:26 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:16.020 nr_hugepages=1024 00:04:16.020 07:29:26 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:16.020 resv_hugepages=0 00:04:16.020 07:29:26 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:16.020 surplus_hugepages=0 00:04:16.020 07:29:26 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:16.020 anon_hugepages=0 00:04:16.020 07:29:26 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:16.020 07:29:26 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:16.020 07:29:26 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:16.020 07:29:26 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:16.020 07:29:26 -- setup/common.sh@18 -- # local node= 00:04:16.020 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.020 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.020 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.020 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.020 07:29:26 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.020 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.020 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.020 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42316232 kB' 'MemAvailable: 43945416 kB' 'Buffers: 6784 kB' 'Cached: 10644804 kB' 'SwapCached: 76 kB' 'Active: 8054940 kB' 'Inactive: 3183352 kB' 'Active(anon): 7147612 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589888 kB' 'Mapped: 167544 kB' 'Shmem: 8903616 kB' 'KReclaimable: 578012 kB' 'Slab: 1581360 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1003348 kB' 'KernelStack: 21856 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11416372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 
'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
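The long single-quoted 'MemTotal: ... kB' runs punctuating this log are not corruption: they are xtrace replaying the printf that feeds the captured meminfo snapshot back into the scan after mapfile -t mem stored it (common.sh@28 and common.sh@16 above). A two-line standalone equivalent, assuming only a stock /proc/meminfo:

    # Snapshot meminfo once into an array, then replay it, as the trace does.
    mapfile -t mem < /proc/meminfo
    printf '%s\n' "${mem[@]}" | head -n 3   # MemTotal, MemFree, MemAvailable lines

Snapshotting once and re-reading the array keeps all the derived numbers in one pass consistent, instead of re-reading /proc/meminfo between keys.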
00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 
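A side note on the \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l tokens filling these entries: under set -x, bash prints a quoted right-hand side of a [[ ... == ... ]] comparison with every character backslash-escaped, to show that it matches literally rather than as a glob pattern. A small demonstration (the variable names here are placeholders):

    set -x
    var=HugePages_Total get=HugePages_Total   # two assignments in one command
    [[ $var == "$get" ]] && echo match
    # traces as: + [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]

So every escaped run in this section is just the quoted "$get" from the scan, echoed back by the tracer.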
00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 
-- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.021 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.021 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.022 07:29:26 -- setup/common.sh@33 -- # echo 1024 00:04:16.022 07:29:26 -- setup/common.sh@33 -- # return 0 00:04:16.022 07:29:26 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:16.022 07:29:26 -- setup/hugepages.sh@112 -- # get_nodes 00:04:16.022 07:29:26 -- setup/hugepages.sh@27 -- # local node 00:04:16.022 07:29:26 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:16.022 07:29:26 -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:16.022 07:29:26 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:16.022 07:29:26 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:16.022 07:29:26 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:16.022 07:29:26 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:16.022 07:29:26 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:16.022 07:29:26 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:16.022 07:29:26 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:16.022 07:29:26 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:16.022 07:29:26 -- setup/common.sh@18 -- # local node=0 00:04:16.022 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.022 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.022 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.022 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:16.022 07:29:26 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:16.022 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.022 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.022 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23710676 kB' 'MemUsed: 8923760 kB' 'SwapCached: 44 kB' 'Active: 4984956 kB' 'Inactive: 535260 kB' 'Active(anon): 4207396 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5312988 kB' 'Mapped: 102552 kB' 'AnonPages: 210384 kB' 'Shmem: 4000180 kB' 'KernelStack: 10728 kB' 'PageTables: 4024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397560 kB' 'Slab: 881872 kB' 'SReclaimable: 397560 kB' 'SUnreclaim: 484312 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # 
continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ 
Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 
07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@33 -- # echo 0 00:04:16.022 07:29:26 -- setup/common.sh@33 -- # return 0 00:04:16.022 07:29:26 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:16.022 07:29:26 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:16.022 07:29:26 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:16.022 07:29:26 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:16.022 07:29:26 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:16.022 07:29:26 -- setup/common.sh@18 -- # local node=1 00:04:16.022 07:29:26 -- setup/common.sh@19 -- # local var val 00:04:16.022 07:29:26 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.022 07:29:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.022 07:29:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:16.022 07:29:26 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:16.022 07:29:26 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.022 07:29:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18605324 kB' 'MemUsed: 9044036 kB' 'SwapCached: 32 kB' 'Active: 3069976 kB' 'Inactive: 2648092 kB' 'Active(anon): 2940208 kB' 'Inactive(anon): 2342652 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5338732 kB' 'Mapped: 64992 kB' 'AnonPages: 379396 kB' 'Shmem: 
4903492 kB' 'KernelStack: 11096 kB' 'PageTables: 4452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 180452 kB' 'Slab: 699480 kB' 'SReclaimable: 180452 kB' 'SUnreclaim: 519028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.022 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.022 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 
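Unlike the system-wide passes earlier, this scan and the node0 one before it read the per-node files (note local node=0 / node=1 and the /sys/devices/system/node/nodeN/meminfo existence checks). Lines in those files carry a Node-number prefix, which common.sh@29 strips with the extglob expansion visible in the trace. A standalone equivalent for node1, assuming a kernel exposing the same sysfs layout:

    # Per-node meminfo lines look like "Node 1 HugePages_Total:   512";
    # strip the prefix the way setup/common.sh@29 does (extglob required).
    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node1/meminfo
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}" | grep -E '^HugePages_(Total|Free|Surp)'

The two per-node totals printed in this run (512 and 512) sum to the system-wide HugePages_Total of 1024 verified earlier.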
00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # continue 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.023 07:29:26 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.023 07:29:26 -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.023 07:29:26 -- setup/common.sh@33 -- # echo 0
00:04:16.023 07:29:26 -- setup/common.sh@33 -- # return 0
00:04:16.023 07:29:26 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.023 07:29:26 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.023 07:29:26 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.023 07:29:26 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:16.023 node0=512 expecting 512
00:04:16.023 07:29:26 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.023 07:29:26 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.023 07:29:26 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.023 07:29:26 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:16.023 node1=512 expecting 512
00:04:16.023 07:29:26 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:16.023
00:04:16.023 real 0m3.262s
00:04:16.023 user 0m1.086s
00:04:16.023 sys 0m2.187s
00:04:16.023 07:29:26 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:16.023 07:29:26 -- common/autotest_common.sh@10 -- # set +x
00:04:16.023 ************************************
00:04:16.023 END TEST even_2G_alloc
00:04:16.023 ************************************
00:04:16.023 07:29:26 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:16.023 07:29:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:16.023 07:29:26 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:16.023 07:29:26 -- common/autotest_common.sh@10 -- # set +x
00:04:16.023 ************************************
00:04:16.023 START TEST odd_alloc
00:04:16.023 ************************************
00:04:16.023 07:29:26 -- common/autotest_common.sh@1114 -- # odd_alloc
00:04:16.023 07:29:26 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:16.023 07:29:26 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:16.023 07:29:26 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:16.023 07:29:26 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:16.023 07:29:26 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:16.023 07:29:26 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:16.023 07:29:26 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:16.023 07:29:26 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:16.023 07:29:26 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:16.023 07:29:26 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:16.023 07:29:26 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:16.023 07:29:26 -- setup/hugepages.sh@83 -- # : 513
00:04:16.023 07:29:26 -- setup/hugepages.sh@84 -- # : 1
00:04:16.023 07:29:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.023 07:29:26 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:16.023 07:29:26 -- setup/hugepages.sh@83 -- # : 0
00:04:16.023 07:29:26 -- setup/hugepages.sh@84 -- # : 0
00:04:16.023 07:29:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.023 07:29:26 --
00:04:16.023 07:29:26 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:16.023 07:29:26 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:16.023 07:29:26 -- setup/hugepages.sh@160 -- # setup output
00:04:16.023 07:29:26 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:16.023 07:29:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:19.321 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:19.321 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:19.321 07:29:29 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:19.321 07:29:29 -- setup/hugepages.sh@89 -- # local node
00:04:19.321 07:29:29 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:19.321 07:29:29 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:19.321 07:29:29 -- setup/hugepages.sh@92 -- # local surp
00:04:19.321 07:29:29 -- setup/hugepages.sh@93 -- # local resv
00:04:19.321 07:29:29 -- setup/hugepages.sh@94 -- # local anon
00:04:19.321 07:29:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:19.321 07:29:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:19.321 07:29:29 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:19.321 07:29:29 -- setup/common.sh@18 -- # local node=
00:04:19.321 07:29:29 -- setup/common.sh@19 -- # local var val
00:04:19.321 07:29:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.321 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.321 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.321 07:29:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.321 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.321 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.321 07:29:29 -- setup/common.sh@31 -- # IFS=': '
00:04:19.321 07:29:29 -- setup/common.sh@31 -- # read -r var val _
00:04:19.321 07:29:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42311444 kB' 'MemAvailable: 43940628 kB' 'Buffers: 6784 kB' 'Cached: 10644944 kB' 'SwapCached: 76 kB' 'Active: 8056492 kB' 'Inactive: 3183352 kB' 'Active(anon): 7149164 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591312 kB' 'Mapped: 167568 kB' 'Shmem: 8903756 kB' 'KReclaimable: 578012 kB' 'Slab: 1580600 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1002588 kB' 'KernelStack: 21856 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11417000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:19.321 07:29:29 -- setup/common.sh@32 -- # [[ ... == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] (repeated per-key scan elided: every /proc/meminfo key is skipped with 'continue' until AnonHugePages matches)
00:04:19.322 07:29:29 -- setup/common.sh@33 -- # echo 0
00:04:19.322 07:29:29 -- setup/common.sh@33 -- # return 0
00:04:19.322 07:29:29 -- setup/hugepages.sh@97 -- # anon=0
00:04:19.322 07:29:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:19.322 07:29:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:19.322 07:29:29 -- setup/common.sh@18 -- # local node=
00:04:19.322 07:29:29 -- setup/common.sh@19 -- # local var val
00:04:19.322 07:29:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.322 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.322 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.322 07:29:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.322 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.322 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.322 07:29:29 -- setup/common.sh@31 -- # IFS=': '
00:04:19.322 07:29:29 -- setup/common.sh@31 -- # read -r var val _
00:04:19.322 07:29:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42311056 kB' 'MemAvailable: 43940240 kB' 'Buffers: 6784 kB' 'Cached: 10644932 kB' 'SwapCached: 76 kB' 'Active: 8056208 kB' 'Inactive: 3183352 kB' 'Active(anon): 7148880 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591044 kB' 'Mapped: 167548 kB' 'Shmem: 8903744 kB' 'KReclaimable: 578012 kB' 'Slab: 1580712 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1002700 kB' 'KernelStack: 21824 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11416996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
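Every scan condensed above is setup/common.sh's get_meminfo doing a linear key lookup: it slurps the meminfo file with mapfile, strips any "Node N " prefix, then reads var/val pairs with IFS=': ' and continues past every key until the requested one matches. A bash sketch of that lookup, reconstructed from the traced commands; treat it as an illustration of the traced behavior, not the exact SPDK source:

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) prefix-strip pattern

  # Sketch: look up one key in /proc/meminfo, or in
  # /sys/devices/system/node/node$2/meminfo when a node is given.
  get_meminfo() {
    local get=$1 node=$2
    local var val _ mem_f
    local -a mem
    mem_f=/proc/meminfo
    # Per-node copies live under /sys and prefix each line with "Node N ".
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")
    # Linear scan: skip every key until the requested one matches.
    while IFS=': ' read -r var val _; do
      [[ $var == "$get" ]] || continue
      echo "$val"
      return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
  }

  get_meminfo HugePages_Total     # -> 1025 in this run (/proc/meminfo)
  get_meminfo HugePages_Surp 0    # node0's file, as the trace later reads

With an empty node argument the /sys path degenerates to /sys/devices/system/node/node/meminfo, which never exists, so the global /proc/meminfo is used; exactly that test shows up in the traced "[[ -e ... ]]" lines.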
00:04:19.322 07:29:29 -- setup/common.sh@32 -- # [[ ... == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] (repeated per-key scan elided: every /proc/meminfo key is skipped with 'continue' until HugePages_Surp matches)
00:04:19.323 07:29:29 -- setup/common.sh@33 -- # echo 0
00:04:19.323 07:29:29 -- setup/common.sh@33 -- # return 0
00:04:19.323 07:29:29 -- setup/hugepages.sh@99 -- # surp=0
00:04:19.323 07:29:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:19.323 07:29:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:19.323 07:29:29 -- setup/common.sh@18 -- # local node=
00:04:19.323 07:29:29 -- setup/common.sh@19 -- # local var val
00:04:19.323 07:29:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.323 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.323 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.323 07:29:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.323 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.323 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.323 07:29:29 -- setup/common.sh@31 -- # IFS=': '
00:04:19.323 07:29:29 -- setup/common.sh@31 -- # read -r var val _
00:04:19.324 07:29:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42311056 kB' 'MemAvailable: 43940240 kB' 'Buffers: 6784 kB' 'Cached: 10644932 kB' 'SwapCached: 76 kB' 'Active: 8056240 kB' 'Inactive: 3183352 kB' 'Active(anon): 7148912 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591080 kB' 'Mapped: 167548 kB' 'Shmem: 8903744 kB' 'KReclaimable: 578012 kB' 'Slab: 1580712 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1002700 kB' 'KernelStack: 21840 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11417012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:19.324 07:29:29 -- setup/common.sh@32 -- # [[ ... == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] (repeated per-key scan elided: every /proc/meminfo key is skipped with 'continue' until HugePages_Rsvd matches)
00:04:19.325 07:29:29 -- setup/common.sh@33 -- # echo 0
00:04:19.325 07:29:29 -- setup/common.sh@33 -- # return 0
00:04:19.325 07:29:29 -- setup/hugepages.sh@100 -- # resv=0
00:04:19.325 07:29:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:19.325 nr_hugepages=1025
00:04:19.325 07:29:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:19.325 resv_hugepages=0
00:04:19.325 07:29:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:19.325 surplus_hugepages=0
00:04:19.325 07:29:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:19.325 anon_hugepages=0
00:04:19.325 07:29:29 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:19.325 07:29:29 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
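Before walking the nodes, the trace asserts the pool accounting: the requested total (1025) must equal nr_hugepages plus the surplus and reserved counts just read (all zero here), and the upcoming HugePages_Total lookup must agree. A hedged sketch of that invariant, reusing the hypothetical get_meminfo helper sketched earlier (names mirror the traced variables; this is an illustration, not the SPDK script itself):

  # Sketch: assert the hugepage pool matches what the test configured.
  # get_meminfo and verify_pool_accounting are assumed helper names.
  verify_pool_accounting() {
    local expected=$1 surp resv anon
    anon=$(get_meminfo AnonHugePages)    # transparent hugepages in use
    surp=$(get_meminfo HugePages_Surp)   # surplus pages beyond the configured pool
    resv=$(get_meminfo HugePages_Rsvd)   # reserved but not yet faulted in
    echo "nr_hugepages=$expected" "resv_hugepages=$resv" \
         "surplus_hugepages=$surp" "anon_hugepages=$anon"
    # Consistent when the kernel's total covers the request plus any
    # surplus/reserved pages (all zero in this run).
    (( $(get_meminfo HugePages_Total) == expected + surp + resv ))
  }

  verify_pool_accounting 1025 && echo 'pool accounting OK'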
00:04:19.325 07:29:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:19.325 07:29:29 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:19.325 07:29:29 -- setup/common.sh@18 -- # local node=
00:04:19.325 07:29:29 -- setup/common.sh@19 -- # local var val
00:04:19.325 07:29:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.325 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.325 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:19.325 07:29:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:19.325 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.325 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.325 07:29:29 -- setup/common.sh@31 -- # IFS=': '
00:04:19.325 07:29:29 -- setup/common.sh@31 -- # read -r var val _
00:04:19.325 07:29:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42311180 kB' 'MemAvailable: 43940364 kB' 'Buffers: 6784 kB' 'Cached: 10644936 kB' 'SwapCached: 76 kB' 'Active: 8056444 kB' 'Inactive: 3183352 kB' 'Active(anon): 7149116 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591276 kB' 'Mapped: 167548 kB' 'Shmem: 8903748 kB' 'KReclaimable: 578012 kB' 'Slab: 1580712 kB' 'SReclaimable: 578012 kB' 'SUnreclaim: 1002700 kB' 'KernelStack: 21840 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11417024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:19.325 07:29:29 -- setup/common.sh@32 -- # [[ ... == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] (repeated per-key scan elided: every /proc/meminfo key is skipped with 'continue' until HugePages_Total matches)
00:04:19.326 07:29:29 -- setup/common.sh@33 -- # echo 1025
00:04:19.326 07:29:29 -- setup/common.sh@33 -- # return 0
00:04:19.326 07:29:29 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:19.326 07:29:29 -- setup/hugepages.sh@112 -- # get_nodes
00:04:19.326 07:29:29 -- setup/hugepages.sh@27 -- # local node
00:04:19.326 07:29:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:19.327 07:29:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:19.327 07:29:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:19.327 07:29:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:19.327 07:29:29 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:19.327 07:29:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:19.327 07:29:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:19.327 07:29:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:19.327 07:29:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:19.327 07:29:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:19.327 07:29:29 -- setup/common.sh@18 -- # local node=0
00:04:19.327 07:29:29 -- setup/common.sh@19 -- # local var val
00:04:19.327 07:29:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:19.327 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:19.327 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:19.327 07:29:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:19.327 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:19.327 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': '
00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _
00:04:19.327 07:29:29 -- setup/common.sh@16 -- #
printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23696948 kB' 'MemUsed: 8937488 kB' 'SwapCached: 44 kB' 'Active: 4985432 kB' 'Inactive: 535260 kB' 'Active(anon): 4207872 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5312988 kB' 'Mapped: 102552 kB' 'AnonPages: 210840 kB' 'Shmem: 4000180 kB' 'KernelStack: 10728 kB' 'PageTables: 3980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397560 kB' 'Slab: 881200 kB' 'SReclaimable: 397560 kB' 'SUnreclaim: 483640 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 
-- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.327 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.327 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 
00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@33 -- # echo 0 00:04:19.328 07:29:29 -- setup/common.sh@33 -- # return 0 00:04:19.328 07:29:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:19.328 07:29:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:19.328 07:29:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:19.328 07:29:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:19.328 07:29:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:19.328 07:29:29 -- setup/common.sh@18 -- # local node=1 00:04:19.328 07:29:29 -- setup/common.sh@19 -- # local var val 00:04:19.328 07:29:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:19.328 07:29:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:19.328 07:29:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:19.328 07:29:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:19.328 07:29:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:19.328 07:29:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:19.328 07:29:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18614060 kB' 'MemUsed: 9035300 kB' 'SwapCached: 32 kB' 'Active: 3070808 kB' 'Inactive: 2648092 kB' 'Active(anon): 2941040 kB' 'Inactive(anon): 2342652 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5338856 kB' 'Mapped: 64996 kB' 'AnonPages: 380204 kB' 'Shmem: 4903616 kB' 'KernelStack: 11096 kB' 'PageTables: 4556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 180452 kB' 'Slab: 699512 kB' 'SReclaimable: 180452 kB' 'SUnreclaim: 519060 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 
00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- 
setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.328 07:29:29 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.328 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.328 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # continue 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:19.329 07:29:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:19.329 07:29:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:19.329 07:29:29 -- setup/common.sh@33 -- # echo 0 00:04:19.329 07:29:29 -- setup/common.sh@33 -- # return 0 00:04:19.329 07:29:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:19.329 07:29:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:19.329 07:29:29 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:19.329 node0=512 expecting 513 00:04:19.329 07:29:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:19.329 07:29:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:19.329 07:29:29 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:19.329 node1=513 expecting 512 00:04:19.329 07:29:29 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:19.329 00:04:19.329 real 0m3.139s 00:04:19.329 user 0m1.018s 00:04:19.329 sys 0m2.059s 00:04:19.329 07:29:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:19.329 07:29:29 -- common/autotest_common.sh@10 -- # set +x 00:04:19.329 ************************************ 00:04:19.329 END TEST odd_alloc 00:04:19.329 ************************************ 00:04:19.329 07:29:29 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:19.329 
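The odd_alloc trace above exercises get_meminfo() twice per node: pick /proc/meminfo or the per-node sysfs meminfo, strip the "Node N " prefix, then scan key by key until the requested field matches. A minimal standalone sketch of that parsing pattern (the helper name is illustrative; setup/common.sh in the SPDK tree is the authoritative version):

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch: return one field from /proc/meminfo, or from a NUMA node's meminfo
# when a node number is given (mirrors the xtrace above; name is illustrative).
get_meminfo_sketch() {
  local get=$1 node=$2 line var val _ mem
  local mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  mapfile -t mem <"$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # per-node lines start with "Node N "
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<<"$line"
    if [[ $var == "$get" ]]; then    # e.g. HugePages_Total
      echo "$val"
      return 0
    fi
  done
  return 1
}

get_meminfo_sketch HugePages_Total     # whole system: 1025 in the run above
get_meminfo_sketch HugePages_Surp 0    # node 0 only: 0 in the run above

The backslash-escaped pattern in the trace ([[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]) is just xtrace's rendering of the literal comparison against $get.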
07:29:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:19.329 07:29:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:19.329 07:29:29 -- common/autotest_common.sh@10 -- # set +x 00:04:19.329 ************************************ 00:04:19.329 START TEST custom_alloc 00:04:19.329 ************************************ 00:04:19.329 07:29:29 -- common/autotest_common.sh@1114 -- # custom_alloc 00:04:19.329 07:29:29 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:19.329 07:29:29 -- setup/hugepages.sh@169 -- # local node 00:04:19.329 07:29:29 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:19.329 07:29:29 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:19.329 07:29:29 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:19.329 07:29:29 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:19.329 07:29:29 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:19.329 07:29:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:19.329 07:29:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:19.329 07:29:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:19.329 07:29:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:19.329 07:29:29 -- setup/hugepages.sh@83 -- # : 256 00:04:19.329 07:29:29 -- setup/hugepages.sh@84 -- # : 1 00:04:19.329 07:29:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:19.329 07:29:29 -- setup/hugepages.sh@83 -- # : 0 00:04:19.329 07:29:29 -- setup/hugepages.sh@84 -- # : 0 00:04:19.329 07:29:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:19.329 07:29:29 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:19.329 07:29:29 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:19.329 07:29:29 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:19.329 07:29:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:19.329 07:29:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:19.329 07:29:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:19.329 07:29:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:19.329
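The two get_test_nr_hugepages calls in this trace turn a pool size in kB into a page count using the default 2048 kB hugepage size: 1048576 / 2048 = 512 pages, then 2097152 / 2048 = 1024 pages. A sketch of that arithmetic (variable names follow the trace; the division is inferred from the values shown):

#!/usr/bin/env bash
default_hugepages=2048            # kB; matches 'Hugepagesize: 2048 kB' in /proc/meminfo
for size in 1048576 2097152; do   # requested pool sizes in kB
  (( size >= default_hugepages )) || continue            # too small for even one page
  echo "nr_hugepages=$(( size / default_hugepages ))"    # prints 512, then 1024
done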
07:29:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:19.329 07:29:29 -- setup/hugepages.sh@78 -- # return 0 00:04:19.329 07:29:29 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:19.329 07:29:29 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:19.329 07:29:29 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:19.329 07:29:29 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:19.329 07:29:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:19.329 07:29:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:19.329 07:29:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:19.329 07:29:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:19.329 07:29:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:19.329 07:29:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:19.329 07:29:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:19.329 07:29:29 -- setup/hugepages.sh@78 -- # return 0 00:04:19.329 07:29:29 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:19.329 07:29:29 -- setup/hugepages.sh@187 -- # setup output 00:04:19.329 07:29:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:19.329 07:29:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:21.866 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.866 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:22.131 07:29:32 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:22.131
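HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' asks setup.sh for an asymmetric split: 512 pages on node 0 plus 1024 on node 1, i.e. the nr_hugepages=1536 total that verify_nr_hugepages checks next. Per-node reservations of this kind ultimately go through the kernel's per-node sysfs knobs; a hedged sketch of that mechanism (run as root; the kernel may satisfy fewer pages than requested, and setup.sh itself does considerably more bookkeeping):

#!/usr/bin/env bash
# Sketch: reserve 2 MB hugepages per NUMA node, the effect a HUGENODE
# string like 'nodes_hp[0]=512,nodes_hp[1]=1024' is asking for.
declare -A want=([0]=512 [1]=1024)
for node in "${!want[@]}"; do
  sysfs=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
  echo "${want[$node]}" >"$sysfs"
  echo "node$node: requested ${want[$node]}, got $(<"$sysfs")"
done
grep HugePages_Total /proc/meminfo   # expect 1536 if both writes were honored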
07:29:32 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:22.131 07:29:32 -- setup/hugepages.sh@89 -- # local node 00:04:22.131 07:29:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:22.131 07:29:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:22.131 07:29:32 -- setup/hugepages.sh@92 -- # local surp 00:04:22.131 07:29:32 -- setup/hugepages.sh@93 -- # local resv 00:04:22.131 07:29:32 -- setup/hugepages.sh@94 -- # local anon 00:04:22.131 07:29:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:22.131 07:29:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:22.131 07:29:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:22.131 07:29:32 -- setup/common.sh@18 -- # local node= 00:04:22.131 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.131 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.131 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.131 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.131 07:29:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.131 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.131 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.131 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.131 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.131 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41269700 kB' 'MemAvailable: 42898852 kB' 'Buffers: 6784 kB' 'Cached: 10645060 kB' 'SwapCached: 76 kB' 'Active: 8060040 kB' 'Inactive: 3183352 kB' 'Active(anon): 7152712 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594824 kB' 'Mapped: 168072 kB' 'Shmem: 8903872 kB' 'KReclaimable: 577980 kB' 'Slab: 1581176 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003196 kB' 'KernelStack: 21856 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11435876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:22.131 07:29:32 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:22.131 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.131 [identical IFS=': ' / read -r var val _ / [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue iterations elided for MemFree through HardwareCorrupted] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:22.132 07:29:32 -- setup/common.sh@33 -- # echo 0 00:04:22.132 07:29:32 -- setup/common.sh@33 -- # return 0 00:04:22.132 07:29:32 -- setup/hugepages.sh@97 -- # anon=0 00:04:22.132 07:29:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
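The hugepages.sh@96 test above decides whether anonymous THP can be in play by checking that the THP policy file does not read '[never]'; here it shows 'always [madvise] never', so the AnonHugePages counter is sampled (and comes back 0). A sketch of the same check:

#!/usr/bin/env bash
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
  # THP is not globally off; anonymous huge pages are possible, so sample them.
  grep AnonHugePages /proc/meminfo
fi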
HugePages_Surp 00:04:22.132 07:29:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.132 07:29:32 -- setup/common.sh@18 -- # local node= 00:04:22.132 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.132 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.132 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.132 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.132 07:29:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.132 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.132 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41272924 kB' 'MemAvailable: 42902076 kB' 'Buffers: 6784 kB' 'Cached: 10645064 kB' 'SwapCached: 76 kB' 'Active: 8056348 kB' 'Inactive: 3183352 kB' 'Active(anon): 7149020 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591084 kB' 'Mapped: 167940 kB' 'Shmem: 8903876 kB' 'KReclaimable: 577980 kB' 'Slab: 1581176 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003196 kB' 'KernelStack: 21824 kB' 'PageTables: 8552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11416792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 
-- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 
07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.132 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.132 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- 
setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 
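The scan in progress here is setup/common.sh's get_meminfo walking /proc/meminfo field by field until the requested key (HugePages_Surp) matches; it returns 0 just below. A minimal sketch of that pattern, assuming extglob and the stock /proc and sysfs layouts (the real helper in SPDK's test/setup/common.sh differs in detail):

    shopt -s extglob

    get_meminfo_sketch() {
        # Print the value of one meminfo field, e.g. HugePages_Surp.
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo line var val _
        # With a node index, read the per-node stats from sysfs instead.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem <"$mem_f"
        # Per-node files prefix every key with "Node <n> "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            # Skip fields until the requested key matches, exactly as the
            # [[ ... == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue pairs
            # in the trace do.
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

On this box, get_meminfo_sketch HugePages_Surp prints 0. The backslash-escaped pattern in the trace is just how xtrace renders a quoted right-hand side of ==, character by character.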
00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.133 07:29:32 -- setup/common.sh@33 -- # echo 0 00:04:22.133 07:29:32 -- setup/common.sh@33 -- # return 0 00:04:22.133 07:29:32 -- setup/hugepages.sh@99 -- # surp=0 00:04:22.133 07:29:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:22.133 07:29:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:22.133 07:29:32 -- setup/common.sh@18 -- # local node= 00:04:22.133 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.133 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.133 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.133 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.133 07:29:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.133 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.133 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41271116 kB' 'MemAvailable: 42900268 kB' 'Buffers: 6784 kB' 'Cached: 10645080 
kB' 'SwapCached: 76 kB' 'Active: 8059912 kB' 'Inactive: 3183352 kB' 'Active(anon): 7152584 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 594560 kB' 'Mapped: 168056 kB' 'Shmem: 8903892 kB' 'KReclaimable: 577980 kB' 'Slab: 1581208 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003228 kB' 'KernelStack: 21776 kB' 'PageTables: 8360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11422100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.133 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.133 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # 
continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ 
Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.134 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.134 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 
07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.135 07:29:32 -- setup/common.sh@33 -- # echo 0 00:04:22.135 07:29:32 -- setup/common.sh@33 -- # return 0 00:04:22.135 07:29:32 -- setup/hugepages.sh@100 -- # resv=0 00:04:22.135 07:29:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:22.135 nr_hugepages=1536 00:04:22.135 07:29:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:22.135 resv_hugepages=0 00:04:22.135 07:29:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:22.135 surplus_hugepages=0 00:04:22.135 07:29:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:22.135 anon_hugepages=0 00:04:22.135 07:29:32 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:22.135 07:29:32 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:22.135 07:29:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:22.135 07:29:32 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:22.135 07:29:32 -- setup/common.sh@18 -- # local node= 00:04:22.135 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.135 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.135 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.135 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.135 07:29:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.135 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.135 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.135 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41270020 kB' 'MemAvailable: 42899172 kB' 'Buffers: 6784 kB' 'Cached: 10645100 kB' 'SwapCached: 76 kB' 'Active: 8055796 kB' 'Inactive: 3183352 kB' 'Active(anon): 7148468 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590308 kB' 'Mapped: 167552 kB' 'Shmem: 8903912 kB' 'KReclaimable: 577980 kB' 'Slab: 1581208 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003228 kB' 'KernelStack: 21760 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11417328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.135 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.135 07:29:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
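This third scan is resolving HugePages_Total; the hugepages.sh@99-110 steps around it have already bound surp=0 and resv=0 and assert that the kernel's view is self-consistent: the total must equal the configured page count plus surplus and reserved pages (1536 == 1536 + 0 + 0 in this run). A hedged sketch of that identity check, with illustrative names:

    check_hugepage_accounting() {
        # want: the page count the test configured (1536 in this run).
        local want=$1 total surp resv
        total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
        surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
        resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
        # Same identity the trace evaluates at hugepages.sh@107 and @110.
        (( total == want + surp + resv ))
    }

    check_hugepage_accounting 1536 && echo "hugepage accounting OK"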
00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Total == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:22.136 07:29:32 -- setup/common.sh@33 -- # echo 1536 00:04:22.136 07:29:32 -- setup/common.sh@33 -- # return 0 00:04:22.136 07:29:32 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:22.136 07:29:32 -- setup/hugepages.sh@112 -- # get_nodes 00:04:22.136 07:29:32 -- setup/hugepages.sh@27 -- # local node 00:04:22.136 07:29:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:22.136 07:29:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:22.136 07:29:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:22.136 07:29:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:22.136 07:29:32 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:22.136 07:29:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:22.136 07:29:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:22.136 07:29:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:22.136 07:29:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:22.136 07:29:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.136 07:29:32 -- setup/common.sh@18 -- # local node=0 00:04:22.136 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.136 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.136 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.136 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:22.136 07:29:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:22.136 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.136 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.136 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.136 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23688192 kB' 'MemUsed: 8946244 kB' 'SwapCached: 44 kB' 'Active: 4985348 kB' 'Inactive: 535260 kB' 'Active(anon): 4207788 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5313008 kB' 'Mapped: 102552 kB' 'AnonPages: 210744 kB' 'Shmem: 4000200 kB' 'KernelStack: 10744 kB' 'PageTables: 4020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397528 kB' 'Slab: 881968 kB' 'SReclaimable: 397528 kB' 'SUnreclaim: 484440 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 
07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 
07:29:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # continue 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:22.137 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.137 07:29:32 -- setup/common.sh@33 -- # echo 0 00:04:22.137 07:29:32 -- setup/common.sh@33 -- # return 0 00:04:22.137 07:29:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:22.137 07:29:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:22.137 07:29:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:22.137 07:29:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:22.137 07:29:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.137 07:29:32 -- setup/common.sh@18 -- # local node=1 00:04:22.137 07:29:32 -- setup/common.sh@19 -- # local var val 00:04:22.137 07:29:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:22.137 07:29:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.137 07:29:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:22.137 07:29:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:22.137 07:29:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.137 07:29:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.137 07:29:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:22.137 
07:29:32 -- setup/common.sh@31 -- # read -r var val _
00:04:22.138 07:29:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17581640 kB' 'MemUsed: 10067720 kB' 'SwapCached: 32 kB' 'Active: 3070940 kB' 'Inactive: 2648092 kB' 'Active(anon): 2941172 kB' 'Inactive(anon): 2342652 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5338984 kB' 'Mapped: 65000 kB' 'AnonPages: 380248 kB' 'Shmem: 4903744 kB' 'KernelStack: 11064 kB' 'PageTables: 4476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 180452 kB' 'Slab: 699240 kB' 'SReclaimable: 180452 kB' 'SUnreclaim: 518788 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: setup/common.sh@32 compared each field of the node meminfo dump above against HugePages_Surp and issued continue for every non-match]
00:04:22.138 07:29:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.138 07:29:32 -- setup/common.sh@33 -- # echo 0
00:04:22.138 07:29:32 -- setup/common.sh@33 -- # return 0
00:04:22.138 07:29:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:22.138 07:29:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:22.138 07:29:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:22.139 07:29:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:22.139 07:29:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:22.139 node0=512 expecting 512
00:04:22.139 07:29:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:22.139 07:29:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:22.139 07:29:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:22.139 07:29:32 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:22.139 node1=1024 expecting 1024
00:04:22.139 07:29:32 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:22.139 real 0m3.076s
00:04:22.139 user 0m1.015s
00:04:22.139 sys 0m1.914s
00:04:22.139 07:29:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:22.139 07:29:32 -- common/autotest_common.sh@10 -- # set +x
00:04:22.139 ************************************
00:04:22.139 END TEST custom_alloc
00:04:22.139 ************************************
00:04:22.139 07:29:32 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:22.139 07:29:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:22.139 07:29:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:22.139 07:29:32 -- common/autotest_common.sh@10 -- # set +x
00:04:22.139 ************************************
00:04:22.139 START TEST no_shrink_alloc
00:04:22.139 ************************************
00:04:22.139 07:29:32 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:22.139 07:29:32 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:22.139 07:29:32 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:22.139 07:29:32 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:22.139 07:29:32 -- setup/hugepages.sh@51 -- # shift
00:04:22.139 07:29:32 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:22.139 07:29:32 -- setup/hugepages.sh@52 -- # local node_ids
00:04:22.139 07:29:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:22.139 07:29:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:22.139 07:29:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:22.139 07:29:32 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:22.139 07:29:32 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:22.139 07:29:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:22.139 07:29:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:22.139 07:29:32 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:22.139 07:29:32 -- setup/hugepages.sh@67 -- # local -g nodes_test
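The get_test_nr_hugepages 2097152 0 call traced above turns a 2097152 kB request into 1024 default-size hugepages (2097152 kB / 2048 kB per page, matching the Hugepagesize this log reports later), and because an explicit node list was passed, the per-node step pins the whole count to node 0. A minimal bash sketch of that sizing arithmetic; size_to_hugepages is an illustrative name, not SPDK's actual helper:

    #!/usr/bin/env bash
    # Hypothetical sketch of the hugepage sizing traced above.
    size_to_hugepages() {
        local size_kb=$1
        # Read the kernel's default hugepage size (2048 kB on this rig).
        local hp_kb
        hp_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)
        echo $((size_kb / hp_kb))
    }

    declare -A nodes_test=()
    nr_hugepages=$(size_to_hugepages 2097152)   # 2097152 / 2048 -> 1024
    for node in 0; do              # user-supplied node IDs, as in the trace
        nodes_test[$node]=$nr_hugepages
    done
    echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]}"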
00:04:22.139 07:29:32 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:22.139 07:29:32 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:22.139 07:29:32 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:22.139 07:29:32 -- setup/hugepages.sh@73 -- # return 0
00:04:22.139 07:29:32 -- setup/hugepages.sh@198 -- # setup output
00:04:22.139 07:29:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:22.139 07:29:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:26.339 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:26.339 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:26.339 07:29:36 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:26.339 07:29:36 -- setup/hugepages.sh@89 -- # local node
00:04:26.339 07:29:36 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:26.339 07:29:36 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:26.339 07:29:36 -- setup/hugepages.sh@92 -- # local surp
00:04:26.339 07:29:36 -- setup/hugepages.sh@93 -- # local resv
00:04:26.339 07:29:36 -- setup/hugepages.sh@94 -- # local anon
00:04:26.339 07:29:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:26.339 07:29:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:26.339 07:29:36 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:26.339 07:29:36 -- setup/common.sh@18 -- # local node=
00:04:26.339 07:29:36 -- setup/common.sh@19 -- # local var val
00:04:26.339 07:29:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:26.339 07:29:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:26.339 07:29:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:26.339 07:29:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:26.340 07:29:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:26.340 07:29:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:26.340 07:29:36 -- setup/common.sh@31 -- # IFS=': '
00:04:26.340 07:29:36 -- setup/common.sh@31 -- # read -r var val _
00:04:26.340 07:29:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42277880 kB' 'MemAvailable: 43907032 kB' 'Buffers: 6784 kB' 'Cached: 10645208 kB' 'SwapCached: 76 kB' 'Active: 8058060 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150732 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592744 kB' 'Mapped: 167560 kB' 'Shmem: 8904020 kB' 'KReclaimable: 577980 kB' 'Slab: 1581408 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003428 kB' 'KernelStack: 22048 kB' 'PageTables: 8948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11422864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: setup/common.sh@32 compared each meminfo field above against AnonHugePages and issued continue for every non-match]
00:04:26.341 07:29:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:26.341 07:29:36 -- setup/common.sh@33 -- # echo 0
00:04:26.341 07:29:36 -- setup/common.sh@33 -- # return 0
00:04:26.341 07:29:36 -- setup/hugepages.sh@97 -- # anon=0
00:04:26.341 07:29:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:26.341 07:29:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:26.341 07:29:36 -- setup/common.sh@18 -- # local node=
00:04:26.341 07:29:36 -- setup/common.sh@19 -- # local var val
00:04:26.341 07:29:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:26.341 07:29:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:26.341 07:29:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:26.341 07:29:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:26.341 07:29:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:26.341 07:29:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:26.341 07:29:36 -- setup/common.sh@31 -- # IFS=': '
00:04:26.341 07:29:36 -- setup/common.sh@31 -- # read -r var val _
00:04:26.341 07:29:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42277740 kB' 'MemAvailable: 43906892 kB' 'Buffers: 6784 kB' 'Cached: 10645212 kB' 'SwapCached: 76 kB' 'Active: 8057932 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150604 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592592 kB' 'Mapped: 167560 kB' 'Shmem: 8904024 kB' 'KReclaimable: 577980 kB' 'Slab: 1581516 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003536 kB' 'KernelStack: 21872 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11422876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
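Every get_meminfo block in this trace follows the same shape: point mem_f at /proc/meminfo (or at a per-node meminfo under /sys/devices/system/node when a node argument is given), mapfile the file into an array, strip the "Node <n> " prefix with an extglob expansion, then read "field: value" pairs under IFS=': ' and continue until the requested field matches. The escaped \A\n\o\n\H\u\g\e\P\a\g\e\s strings are simply how xtrace renders the quoted right-hand side of that comparison. A self-contained sketch of the pattern, under the assumption that it mirrors setup/common.sh's behavior:

    #!/usr/bin/env bash
    # Standalone re-creation of the get_meminfo pattern seen in the trace.
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node meminfo files prefix every line with "Node <n> ".
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo

        local -a mem
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix

        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo AnonHugePages      # system-wide, as in the call above
    get_meminfo HugePages_Free 0   # node 0, if that sysfs file exists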
[xtrace elided: setup/common.sh@32 compared each meminfo field against HugePages_Surp and issued continue for every non-match]
00:04:26.342 07:29:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:26.342 07:29:36 -- setup/common.sh@33 -- # echo 0
00:04:26.342 07:29:36 -- setup/common.sh@33 -- # return 0
00:04:26.342 07:29:36 -- setup/hugepages.sh@99 -- # surp=0
00:04:26.342 07:29:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:26.342 07:29:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:26.342 07:29:36 -- setup/common.sh@18 -- # local node=
00:04:26.342 07:29:36 -- setup/common.sh@19 -- # local var val
00:04:26.342 07:29:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:26.342 07:29:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:26.342 07:29:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:26.342 07:29:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:26.342 07:29:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:26.342 07:29:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:26.342 07:29:36 -- setup/common.sh@31 -- # IFS=': '
00:04:26.342 07:29:36 -- setup/common.sh@31 -- # read -r var val _
00:04:26.342 07:29:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42277128 kB' 'MemAvailable: 43906280 kB' 'Buffers: 6784 kB' 'Cached: 10645224 kB' 'SwapCached: 76 kB' 'Active: 8057640 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150312 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592228 kB' 'Mapped: 167560 kB' 'Shmem: 8904036 kB' 'KReclaimable: 577980 kB' 'Slab: 1581516 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003536 kB' 'KernelStack: 22032 kB' 'PageTables: 9000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11421376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: setup/common.sh@32 compared each meminfo field against HugePages_Rsvd and issued continue for every non-match]
00:04:26.344 07:29:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:26.344 07:29:36 -- setup/common.sh@33 -- # echo 0
00:04:26.344 07:29:36 -- setup/common.sh@33 -- # return 0
00:04:26.344 07:29:36 -- setup/hugepages.sh@100 -- # resv=0
00:04:26.344 07:29:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:26.344 nr_hugepages=1024
00:04:26.344 07:29:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:26.344 resv_hugepages=0
00:04:26.344 07:29:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:26.344 surplus_hugepages=0
00:04:26.344 07:29:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:26.344 anon_hugepages=0
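With anon, surp, and resv all read back as 0, the check that follows can assert that the kernel's HugePages_Total really accounts for the configured pool: the total must equal nr_hugepages plus surplus plus reserved pages. A hedged sketch of that verification against the live counters; verify_hugepage_pool is an illustrative name, not SPDK's function:

    #!/usr/bin/env bash
    # Sketch of the accounting check at hugepages.sh@107 just below.
    verify_hugepage_pool() {
        local configured=$1   # pages the test requested (nr_hugepages)
        local total surp resv
        total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
        surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
        resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)

        # Mirrors (( 1024 == nr_hugepages + surp + resv )) in the trace:
        # the kernel total must cover the requested pool plus any surplus
        # and reserved pages.
        if (( total == configured + surp + resv )); then
            echo "pool OK: total=$total surp=$surp resv=$resv"
        else
            echo "pool mismatch: total=$total expected=$configured" >&2
            return 1
        fi
    }

    verify_hugepage_pool 1024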
00:04:26.344 07:29:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:26.344 07:29:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:26.344 07:29:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:26.344 07:29:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:26.344 07:29:36 -- setup/common.sh@18 -- # local node=
00:04:26.344 07:29:36 -- setup/common.sh@19 -- # local var val
00:04:26.344 07:29:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:26.344 07:29:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:26.344 07:29:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:26.344 07:29:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:26.344 07:29:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:26.344 07:29:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:26.344 07:29:36 -- setup/common.sh@31 -- # IFS=': '
00:04:26.344 07:29:36 -- setup/common.sh@31 -- # read -r var val _
00:04:26.344 07:29:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42276908 kB' 'MemAvailable: 43906060 kB' 'Buffers: 6784 kB' 'Cached: 10645236 kB' 'SwapCached: 76 kB' 'Active: 8057560 kB' 'Inactive: 3183352 kB' 'Active(anon): 7150232 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592132 kB' 'Mapped: 167560 kB' 'Shmem: 8904048 kB' 'KReclaimable: 577980 kB' 'Slab: 1581516 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003536 kB' 'KernelStack: 21952 kB' 'PageTables: 8732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11422904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:26.344 07:29:36 -- [xtrace elided: compare-and-continue repeats for every field from MemTotal through HugePages_Free until the requested key matches]
00:04:26.345 07:29:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:26.345 07:29:36 -- setup/common.sh@33 -- # echo 1024
00:04:26.345 07:29:36 -- setup/common.sh@33 -- # return 0
00:04:26.345 07:29:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:26.345 07:29:36 -- setup/hugepages.sh@112 -- # get_nodes
00:04:26.345 07:29:36 -- setup/hugepages.sh@27 -- # local node
00:04:26.345 07:29:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:26.345 07:29:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:26.345 07:29:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:26.345 07:29:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:26.345 07:29:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:26.345 07:29:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:26.345 07:29:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:26.345 07:29:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
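get_nodes above discovers the NUMA topology by globbing /sys/devices/system/node/node+([0-9]) and records each node's current hugepage count in nodes_sys[]. A sketch of the same discovery, assuming the per-node count is read from the sysfs hugepages-2048kB/nr_hugepages counter (the real script tracks the value it configured during setup):

#!/usr/bin/env bash
# Node discovery as in get_nodes: glob the NUMA node directories and
# record each node's current 2 MiB hugepage count.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # sysfs per-node counter (assumed source for this illustration).
    nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
done
echo "no_nodes=${#nodes_sys[@]}"      # 2 on the machine in this log
for n in "${!nodes_sys[@]}"; do
    echo "node$n=${nodes_sys[$n]}"    # node0=1024, node1=0
done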
00:04:26.345 07:29:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:26.345 07:29:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:26.345 07:29:36 -- setup/common.sh@18 -- # local node=0
00:04:26.345 07:29:36 -- setup/common.sh@19 -- # local var val
00:04:26.345 07:29:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:26.345 07:29:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:26.345 07:29:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:26.345 07:29:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:26.345 07:29:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:26.345 07:29:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:26.345 07:29:36 -- setup/common.sh@31 -- # IFS=': '
00:04:26.345 07:29:36 -- setup/common.sh@31 -- # read -r var val _
00:04:26.345 07:29:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22634320 kB' 'MemUsed: 10000116 kB' 'SwapCached: 44 kB' 'Active: 4986676 kB' 'Inactive: 535260 kB' 'Active(anon): 4209116 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5313036 kB' 'Mapped: 102552 kB' 'AnonPages: 212116 kB' 'Shmem: 4000228 kB' 'KernelStack: 10872 kB' 'PageTables: 4472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397528 kB' 'Slab: 882128 kB' 'SReclaimable: 397528 kB' 'SUnreclaim: 484600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:26.346 07:29:36 -- [xtrace elided: compare-and-continue repeats for each node0 meminfo field until HugePages_Surp matches]
00:04:26.346 07:29:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:26.346 07:29:36 -- setup/common.sh@33 -- # echo 0
00:04:26.346 07:29:36 -- setup/common.sh@33 -- # return 0
00:04:26.346 07:29:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:26.346 07:29:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:26.346 07:29:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:26.346 07:29:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:26.346 07:29:36 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:26.346 node0=1024 expecting 1024
00:04:26.346 07:29:36 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:26.346 07:29:36 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:26.346 07:29:36 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:26.346 07:29:36 -- setup/hugepages.sh@202 -- # setup output
00:04:26.346 07:29:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:26.346 07:29:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:29.645 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:29.645 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:29.645 INFO: Requested 512 hugepages but 1024 already allocated on node0
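The INFO line above records an idempotent no-op: 512 pages were requested for node0, but 1024 are already reserved, so the allocation is left alone. A sketch of that check against the kernel's sysfs interface for 2 MiB pages (setup.sh's internals may differ):

#!/usr/bin/env bash
# Idempotent reservation, as the INFO line reports: only raise
# nr_hugepages when the node holds fewer pages than requested.
want=512 node=0
f=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
have=$(<"$f")
if ((have >= want)); then
    echo "INFO: Requested $want hugepages but $have already allocated on node$node"
else
    echo "$want" > "$f"   # requires root
fi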
]] 00:04:29.645 07:29:40 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:29.645 07:29:40 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:29.645 07:29:40 -- setup/common.sh@18 -- # local node= 00:04:29.645 07:29:40 -- setup/common.sh@19 -- # local var val 00:04:29.645 07:29:40 -- setup/common.sh@20 -- # local mem_f mem 00:04:29.645 07:29:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.645 07:29:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.645 07:29:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.645 07:29:40 -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.645 07:29:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.645 07:29:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42303564 kB' 'MemAvailable: 43932716 kB' 'Buffers: 6784 kB' 'Cached: 10645324 kB' 'SwapCached: 76 kB' 'Active: 8058992 kB' 'Inactive: 3183352 kB' 'Active(anon): 7151664 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593436 kB' 'Mapped: 167604 kB' 'Shmem: 8904136 kB' 'KReclaimable: 577980 kB' 'Slab: 1581316 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003336 kB' 'KernelStack: 21840 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11418952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.645 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.645 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read 
-r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 
-- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- 
setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.646 07:29:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.646 07:29:40 -- setup/common.sh@33 -- # echo 0 00:04:29.646 07:29:40 -- setup/common.sh@33 -- # return 0 00:04:29.646 07:29:40 -- setup/hugepages.sh@97 -- # anon=0 00:04:29.646 07:29:40 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:29.646 07:29:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.646 07:29:40 -- setup/common.sh@18 -- # local node= 00:04:29.646 07:29:40 -- setup/common.sh@19 -- # local var val 00:04:29.646 07:29:40 -- setup/common.sh@20 -- # local mem_f mem 00:04:29.646 07:29:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.646 07:29:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.646 07:29:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.646 07:29:40 -- 
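The hugepages.sh@96 test above compares the contents of /sys/kernel/mm/transparent_hugepage/enabled, where the kernel brackets the active mode (here "always [madvise] never"), against *[never]*: AnonHugePages only counts toward the expected total when THP is not disabled. A sketch of the same gate:

#!/usr/bin/env bash
# THP gate from hugepages.sh@96: the kernel brackets the active mode,
# e.g. "always [madvise] never"; anonymous hugepages are only expected
# when the selected mode is not [never].
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *"[never]"* ]]; then
    grep AnonHugePages /proc/meminfo   # counted toward the total
else
    echo "THP disabled; AnonHugePages assumed 0"
fi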
setup/common.sh@28 -- # mapfile -t mem 00:04:29.646 07:29:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.646 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42309372 kB' 'MemAvailable: 43938524 kB' 'Buffers: 6784 kB' 'Cached: 10645328 kB' 'SwapCached: 76 kB' 'Active: 8060844 kB' 'Inactive: 3183352 kB' 'Active(anon): 7153516 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595800 kB' 'Mapped: 167564 kB' 'Shmem: 8904140 kB' 'KReclaimable: 577980 kB' 'Slab: 1581348 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003368 kB' 'KernelStack: 21824 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11418964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 
07:29:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.647 07:29:40 -- setup/common.sh@32 -- # continue 00:04:29.647 07:29:40 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:29.647 07:29:40 -- setup/common.sh@31 -- # read -r var val _
00:04:29.647 07:29:40 -- setup/common.sh@32 -- # (condensed: remaining /proc/meminfo fields AnonPages .. HugePages_Free each checked against HugePages_Surp -- no match, continue)
00:04:29.648 07:29:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.648 07:29:40 -- setup/common.sh@33 -- # echo 0
00:04:29.648 07:29:40 -- setup/common.sh@33 -- # return 0
00:04:29.648 07:29:40 -- setup/hugepages.sh@99 -- # surp=0
00:04:29.648 07:29:40 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:29.648 07:29:40 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:29.648 07:29:40 -- setup/common.sh@18 -- # local node=
00:04:29.648 07:29:40 -- setup/common.sh@19 -- # local var val
00:04:29.648 07:29:40 -- setup/common.sh@20 -- # local mem_f mem
00:04:29.648 07:29:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.648 07:29:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.648 07:29:40 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.648 07:29:40 -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.648 07:29:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.648 07:29:40 -- setup/common.sh@31 -- # IFS=': '
00:04:29.648 07:29:40 -- setup/common.sh@31 -- # read -r var val _
00:04:29.648 07:29:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42308392 kB' 'MemAvailable: 43937544 kB' 'Buffers: 6784 kB' 'Cached: 10645340 kB' 'SwapCached: 76 kB' 'Active: 8061024 kB' 'Inactive: 3183352 kB' 'Active(anon): 7153696 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596060 kB' 'Mapped: 167564 kB' 'Shmem: 8904152 kB' 'KReclaimable: 577980 kB' 'Slab: 1581340 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003360 kB' 'KernelStack: 21824 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11418980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
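The field-by-field iterations condensed above all come from one helper, get_meminfo in test/setup/common.sh, which snapshots a meminfo file and scans it line by line until the requested field matches. A minimal sketch of that pattern, reconstructed from the xtrace (treat it as an illustration of the traced logic, not the verbatim SPDK helper):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node +([0-9]) " prefix-stripping pattern

    get_meminfo() {   # get_meminfo <Field> [<node>]
        local get=$1 node=${2:-}
        local var val _ mem_f mem line
        mem_f=/proc/meminfo
        # per-node lookups read the node-local file instead of the global one
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # per-node meminfo lines carry a "Node N " prefix; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo HugePages_Rsvd      # -> 0 on this box
    get_meminfo HugePages_Surp 0    # node0 lookup -> 0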
00:04:29.648 07:29:40 -- setup/common.sh@32 -- # (condensed: snapshot fields MemTotal .. HugePages_Free each checked against HugePages_Rsvd -- no match, continue)
00:04:29.649 07:29:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:29.649 07:29:40 -- setup/common.sh@33 -- # echo 0
00:04:29.649 07:29:40 -- setup/common.sh@33 -- # return 0
00:04:29.649 07:29:40 -- setup/hugepages.sh@100 -- # resv=0
00:04:29.649 07:29:40 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:29.649 nr_hugepages=1024
00:04:29.649 07:29:40 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:29.649 resv_hugepages=0
00:04:29.649 07:29:40 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:29.649 surplus_hugepages=0
00:04:29.649 07:29:40 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:29.649 anon_hugepages=0
00:04:29.649 07:29:40 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:29.649 07:29:40 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:29.649 07:29:40 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:29.649 07:29:40 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:29.649 07:29:40 -- setup/common.sh@18 -- # local node=
00:04:29.649 07:29:40 -- setup/common.sh@19 -- # local var val
00:04:29.649 07:29:40 -- setup/common.sh@20 -- # local mem_f mem
00:04:29.649 07:29:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.649 07:29:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.649 07:29:40 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.649 07:29:40 -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.649 07:29:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.649 07:29:40 -- setup/common.sh@31 -- # IFS=': '
00:04:29.649 07:29:40 -- setup/common.sh@31 -- # read -r var val _
00:04:29.650 07:29:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42308912 kB' 'MemAvailable: 43938064 kB' 'Buffers: 6784 kB' 'Cached: 10645352 kB' 'SwapCached: 76 kB' 'Active: 8061188 kB' 'Inactive: 3183352 kB' 'Active(anon): 7153860 kB' 'Inactive(anon): 2342708 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596176 kB' 'Mapped: 167564 kB' 'Shmem: 8904164 kB' 'KReclaimable: 577980 kB' 'Slab: 1581340 kB' 'SReclaimable: 577980 kB' 'SUnreclaim: 1003360 kB' 'KernelStack: 21824 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11418992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
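What the test just asserted is the kernel's hugepage accounting identity: the pool it requested (nr_hugepages=1024) must equal HugePages_Total once surplus and reserved pages are folded in. An equivalent standalone probe, with 1024 standing in for the requested count (the awk one-liner is illustrative, not part of the suite):

    awk -v want=1024 '
        /^HugePages_(Total|Rsvd|Surp):/ { v[$1] = $2 }
        END {
            # the identity the test checks: total == requested + surplus + reserved
            exit !(v["HugePages_Total:"] == want + v["HugePages_Surp:"] + v["HugePages_Rsvd:"])
        }' /proc/meminfo && echo "hugepage accounting consistent"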
00:04:29.650 07:29:40 -- setup/common.sh@32 -- # (condensed: snapshot fields MemTotal .. Unaccepted each checked against HugePages_Total -- no match, continue)
00:04:29.651 07:29:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:29.651 07:29:40 -- setup/common.sh@33 -- # echo 1024
00:04:29.651 07:29:40 -- setup/common.sh@33 -- # return 0
00:04:29.651 07:29:40 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:29.651 07:29:40 -- setup/hugepages.sh@112 -- # get_nodes
00:04:29.651 07:29:40 -- setup/hugepages.sh@27 -- # local node
00:04:29.651 07:29:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:29.651 07:29:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:29.651 07:29:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:29.651 07:29:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:29.651 07:29:40 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:29.651 07:29:40 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:29.651 07:29:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:29.651 07:29:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:29.651 07:29:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:29.651 07:29:40 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:29.651 07:29:40 -- setup/common.sh@18 -- # local node=0
00:04:29.651 07:29:40 -- setup/common.sh@19 -- # local var val
00:04:29.651 07:29:40 -- setup/common.sh@20 -- # local mem_f mem
00:04:29.651 07:29:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.651 07:29:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:29.651 07:29:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:29.651 07:29:40 -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.651 07:29:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.651 07:29:40 -- setup/common.sh@31 -- # IFS=': '
00:04:29.651 07:29:40 -- setup/common.sh@31 -- # read -r var val _
00:04:29.651 07:29:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22657316 kB' 'MemUsed: 9977120 kB' 'SwapCached: 44 kB' 'Active: 4988364 kB' 'Inactive: 535260 kB' 'Active(anon): 4210804 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5313064 kB' 'Mapped: 102552 kB' 'AnonPages: 214104 kB' 'Shmem: 4000256 kB' 'KernelStack: 10728 kB' 'PageTables: 3980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 397528 kB' 'Slab: 881756 kB' 'SReclaimable: 397528 kB' 'SUnreclaim: 484228 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
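get_nodes walks /sys/devices/system/node/node+([0-9]) and records each node's hugepage count (here nodes_sys[0]=1024 and nodes_sys[1]=0, hence no_nodes=2). The trace does not show where the per-node counts are read from; the usual source is each node's nr_hugepages sysfs file, which is what this sketch assumes:

    shopt -s extglob
    declare -A nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # hugepages-2048kB matches the 2048 kB Hugepagesize reported above
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]}"                     # 2 on this machine
    echo "node0=${nodes_sys[0]} node1=${nodes_sys[1]}"   # 1024 and 0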
00:04:29.651 07:29:40 -- setup/common.sh@32 -- # (condensed: node0 fields MemTotal .. HugePages_Free each checked against HugePages_Surp -- no match, continue)
00:04:29.652 07:29:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.652 07:29:40 -- setup/common.sh@33 -- # echo 0
00:04:29.652 07:29:40 -- setup/common.sh@33 -- # return 0
00:04:29.652 07:29:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:29.652 07:29:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:29.652 07:29:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:29.652 07:29:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:29.652 07:29:40 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:29.652 node0=1024 expecting 1024
00:04:29.652 07:29:40 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:29.652 00:04:29.652 real 0m7.267s
00:04:29.652 user 0m2.723s
00:04:29.652 sys 0m4.666s
00:04:29.652 07:29:40 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:29.652 07:29:40 -- common/autotest_common.sh@10 -- # set +x
00:04:29.652 ************************************
00:04:29.652 END TEST no_shrink_alloc
00:04:29.652 ************************************
00:04:29.652 07:29:40 -- setup/hugepages.sh@217 -- # clear_hp
00:04:29.652 07:29:40 -- setup/hugepages.sh@37 -- # local node hp
00:04:29.652 07:29:40 -- setup/hugepages.sh@39-41 -- # (condensed: for each node in "${!nodes_sys[@]}" and each hp in /sys/devices/system/node/node$node/hugepages/hugepages-*, echo 0 -- four writes in total, two page sizes per node)
00:04:29.652 07:29:40 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:29.652 07:29:40 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
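clear_hp tears the configuration back down by zeroing every per-node hugepage pool; the trace above shows one echo 0 per hugepages-* directory on each node. A sketch of that teardown (the redirect target is inferred -- the xtrace records only the echo, and nr_hugepages is the standard sysfs knob):

    for node in /sys/devices/system/node/node*[0-9]; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # release this node's pool of that page size
        done
    done
    export CLEAR_HUGE=yes   # exported for later stages, per the trace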
00:04:29.652 00:04:29.652 real 0m25.600s
00:04:29.652 user 0m8.511s
00:04:29.652 sys 0m15.493s
00:04:29.652 07:29:40 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:29.652 07:29:40 -- common/autotest_common.sh@10 -- # set +x
00:04:29.652 ************************************
00:04:29.652 END TEST hugepages
00:04:29.652 ************************************
00:04:29.652 07:29:40 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:29.652 07:29:40 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:29.652 07:29:40 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:29.652 07:29:40 -- common/autotest_common.sh@10 -- # set +x
00:04:29.652 ************************************
00:04:29.652 START TEST driver
00:04:29.652 ************************************
00:04:29.652 07:29:40 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:29.652 * Looking for test storage...
00:04:29.652 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:29.652 07:29:40 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:29.652 07:29:40 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:29.652 07:29:40 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:29.912 07:29:40 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:29.913 07:29:40 -- scripts/common.sh@372-367 -- # (condensed cmp_versions trace: ver1=1.15 and ver2=2 are split on IFS=.-:, op='<', ver1_l=2 ver2_l=1; the fields are walked with the decimal helper, and 1 < 2 on the first field makes cmp_versions return 0 -- the installed lcov 1.15 predates 2)
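The lt 1.15 2 call condensed above is scripts/common.sh's field-by-field version compare: both strings are split on '.', '-' and ':', then walked index by index with missing fields treated as 0. A compact re-creation of just the less-than path (a sketch; the real cmp_versions also handles '>', '=' and non-numeric fields via its decimal helper):

    lt() {   # lt A B -> success when version A sorts before version B
        local -a ver1 ver2
        local v len
        IFS='.-:' read -ra ver1 <<< "$1"
        IFS='.-:' read -ra ver2 <<< "$2"
        len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < len; v++ )); do
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        done
        return 1   # equal is not less-than
    }

    lt 1.15 2 && echo "old lcov"   # 1 < 2 on the first field, as in the trace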
00:04:29.913 07:29:40 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:29.913 07:29:40 -- common/autotest_common.sh@1703-1704 -- # (condensed: the same option block is exported four times, as LCOV_OPTS and as LCOV='lcov ...'):
00:04:29.913 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:29.913 --rc genhtml_branch_coverage=1
00:04:29.913 --rc genhtml_function_coverage=1
00:04:29.913 --rc genhtml_legend=1
00:04:29.913 --rc geninfo_all_blocks=1
00:04:29.913 --rc geninfo_unexecuted_blocks=1
00:04:29.913 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:29.913 07:29:40 -- setup/driver.sh@68 -- # setup reset
00:04:29.913 07:29:40 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:29.913 07:29:40 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:35.198 07:29:45 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:35.199 07:29:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:35.199 07:29:45 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:35.199 07:29:45 -- common/autotest_common.sh@10 -- # set +x
00:04:35.199 ************************************
00:04:35.199 START TEST guess_driver
00:04:35.199 ************************************
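Those exported option blocks are consumed whenever the suite later invokes lcov or genhtml. A hedged illustration of the shape of such a call (the --capture invocation itself is illustrative; only the flag values come from this log):

    # LCOV_OPTS left unquoted on purpose so each --rc flag splits into its own argument
    lcov --capture --directory . $LCOV_OPTS --output-file coverage.info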
00:04:35.199 ************************************ 00:04:35.199 07:29:45 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:35.199 07:29:45 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:35.199 07:29:45 -- setup/driver.sh@47 -- # local fail=0 00:04:35.199 07:29:45 -- setup/driver.sh@49 -- # pick_driver 00:04:35.199 07:29:45 -- setup/driver.sh@36 -- # vfio 00:04:35.199 07:29:45 -- setup/driver.sh@21 -- # local iommu_grups 00:04:35.199 07:29:45 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:35.199 07:29:45 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:35.199 07:29:45 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:35.199 07:29:45 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:35.199 07:29:45 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:35.199 07:29:45 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:35.199 07:29:45 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:35.199 07:29:45 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:35.199 07:29:45 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:35.199 07:29:45 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:35.199 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:35.199 07:29:45 -- setup/driver.sh@30 -- # return 0 00:04:35.199 07:29:45 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:35.199 07:29:45 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:35.199 07:29:45 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:35.199 07:29:45 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:35.199 Looking for driver=vfio-pci 00:04:35.199 07:29:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:35.199 07:29:45 -- setup/driver.sh@45 -- # setup output config 00:04:35.199 07:29:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:35.199 07:29:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:37.791 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:37.791 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:37.791 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.791 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:37.791 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:37.791 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.791 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:37.791 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:37.791 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:37.791 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:37.791 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:37.791 07:29:48 -- 
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.050 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.050 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.050 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:38.051 07:29:48 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:38.051 07:29:48 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:38.051 07:29:48 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:39.432 07:29:50 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:39.432 07:29:50 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:39.432 07:29:50 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:39.691 07:29:50 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:39.691 07:29:50 -- setup/driver.sh@65 -- # setup reset 00:04:39.691 07:29:50 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:39.691 07:29:50 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:44.972 00:04:44.972 real 0m9.602s 00:04:44.972 user 0m2.584s 00:04:44.972 sys 0m4.831s 00:04:44.972 07:29:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:44.972 07:29:54 -- common/autotest_common.sh@10 
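The guess_driver trace above shows setup/driver.sh settling on vfio-pci: the unsafe no-IOMMU module parameter reads N, 176 IOMMU groups exist, and modprobe --show-depends resolves vfio_pci to real .ko modules. A minimal standalone sketch of that decision follows; the helper names mirror the trace, but the bodies are simplified reconstructions rather than the shipped script, and the uio_pci_generic fallback is an assumption (the trace only proves the "No valid driver found" string exists).

```bash
#!/usr/bin/env bash
shopt -s nullglob

is_driver() {
    # A module is usable if modprobe resolves it to real .ko files.
    local mod=$1
    [[ $(modprobe --show-depends "$mod" 2>/dev/null) == *.ko* ]]
}

pick_driver() {
    local iommu_groups unsafe_vfio=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    iommu_groups=(/sys/kernel/iommu_groups/*)
    # vfio-pci needs a populated IOMMU or unsafe no-IOMMU mode enabled.
    if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == Y ]]; then
        is_driver vfio_pci && { echo vfio-pci; return 0; }
    fi
    # Assumed fallback; not confirmed by the trace above.
    is_driver uio_pci_generic && { echo uio_pci_generic; return 0; }
    echo "No valid driver found"
}

pick_driver   # prints vfio-pci on the logged machine (176 IOMMU groups)
```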
-- # set +x 00:04:44.972 ************************************ 00:04:44.972 END TEST guess_driver 00:04:44.972 ************************************ 00:04:44.972 00:04:44.972 real 0m14.552s 00:04:44.972 user 0m4.023s 00:04:44.972 sys 0m7.592s 00:04:44.972 07:29:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:44.972 07:29:54 -- common/autotest_common.sh@10 -- # set +x 00:04:44.972 ************************************ 00:04:44.972 END TEST driver 00:04:44.972 ************************************ 00:04:44.972 07:29:54 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:44.972 07:29:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:44.972 07:29:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:44.972 07:29:54 -- common/autotest_common.sh@10 -- # set +x 00:04:44.972 ************************************ 00:04:44.972 START TEST devices 00:04:44.972 ************************************ 00:04:44.972 07:29:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:44.972 * Looking for test storage... 00:04:44.972 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:44.972 07:29:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:44.972 07:29:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:44.972 07:29:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:44.972 07:29:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:44.972 07:29:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:44.972 07:29:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:44.972 07:29:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:44.972 07:29:55 -- scripts/common.sh@335 -- # IFS=.-: 00:04:44.972 07:29:55 -- scripts/common.sh@335 -- # read -ra ver1 00:04:44.972 07:29:55 -- scripts/common.sh@336 -- # IFS=.-: 00:04:44.972 07:29:55 -- scripts/common.sh@336 -- # read -ra ver2 00:04:44.972 07:29:55 -- scripts/common.sh@337 -- # local 'op=<' 00:04:44.972 07:29:55 -- scripts/common.sh@339 -- # ver1_l=2 00:04:44.972 07:29:55 -- scripts/common.sh@340 -- # ver2_l=1 00:04:44.972 07:29:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:44.972 07:29:55 -- scripts/common.sh@343 -- # case "$op" in 00:04:44.972 07:29:55 -- scripts/common.sh@344 -- # : 1 00:04:44.972 07:29:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:44.972 07:29:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:44.972 07:29:55 -- scripts/common.sh@364 -- # decimal 1 00:04:44.972 07:29:55 -- scripts/common.sh@352 -- # local d=1 00:04:44.972 07:29:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:44.972 07:29:55 -- scripts/common.sh@354 -- # echo 1 00:04:44.972 07:29:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:44.972 07:29:55 -- scripts/common.sh@365 -- # decimal 2 00:04:44.972 07:29:55 -- scripts/common.sh@352 -- # local d=2 00:04:44.972 07:29:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:44.972 07:29:55 -- scripts/common.sh@354 -- # echo 2 00:04:44.972 07:29:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:44.972 07:29:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:44.972 07:29:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:44.972 07:29:55 -- scripts/common.sh@367 -- # return 0 00:04:44.972 07:29:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:44.972 07:29:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:44.972 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.972 --rc genhtml_branch_coverage=1 00:04:44.972 --rc genhtml_function_coverage=1 00:04:44.972 --rc genhtml_legend=1 00:04:44.972 --rc geninfo_all_blocks=1 00:04:44.972 --rc geninfo_unexecuted_blocks=1 00:04:44.972 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.972 ' 00:04:44.972 07:29:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:44.972 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.972 --rc genhtml_branch_coverage=1 00:04:44.972 --rc genhtml_function_coverage=1 00:04:44.972 --rc genhtml_legend=1 00:04:44.972 --rc geninfo_all_blocks=1 00:04:44.972 --rc geninfo_unexecuted_blocks=1 00:04:44.972 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.972 ' 00:04:44.972 07:29:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:44.972 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.972 --rc genhtml_branch_coverage=1 00:04:44.972 --rc genhtml_function_coverage=1 00:04:44.972 --rc genhtml_legend=1 00:04:44.972 --rc geninfo_all_blocks=1 00:04:44.972 --rc geninfo_unexecuted_blocks=1 00:04:44.972 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.972 ' 00:04:44.972 07:29:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:44.972 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.972 --rc genhtml_branch_coverage=1 00:04:44.972 --rc genhtml_function_coverage=1 00:04:44.972 --rc genhtml_legend=1 00:04:44.972 --rc geninfo_all_blocks=1 00:04:44.972 --rc geninfo_unexecuted_blocks=1 00:04:44.972 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.972 ' 00:04:44.972 07:29:55 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:44.972 07:29:55 -- setup/devices.sh@192 -- # setup reset 00:04:44.972 07:29:55 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:44.972 07:29:55 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:48.267 07:29:58 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:48.267 07:29:58 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:48.267 07:29:58 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:48.267 07:29:58 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:48.267 07:29:58 
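The lcov gate traced above (lt 1.15 2 through cmp_versions) splits both version strings on '.', '-' and ':' and compares them field by field, normalizing each field through decimal. A compact sketch of that comparison, simplified to a single lt helper rather than the full cmp_versions operator dispatch; the traced decimal errors on non-numeric fields, where this one just treats them as 0.

```bash
# decimal(): normalize one version field; non-numeric fields become 0
# here (the traced helper is stricter and errors out instead).
decimal() {
    local d=$1
    [[ $d =~ ^[0-9]+$ ]] || d=0
    echo "$d"
}

lt() {  # succeeds (returns 0) when version $1 < version $2
    local -a ver1 ver2
    local v a b
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
        a=$(decimal "${ver1[v]:-0}")
        b=$(decimal "${ver2[v]:-0}")
        (( a > b )) && return 1
        (( a < b )) && return 0
    done
    return 1   # equal is not "less than"
}

lt 1.15 2 && echo "lcov 1.15 predates 2.x"   # the branch taken in the log
```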
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:48.267 07:29:58 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:48.267 07:29:58 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:48.267 07:29:58 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:48.267 07:29:58 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:48.267 07:29:58 -- setup/devices.sh@196 -- # blocks=() 00:04:48.267 07:29:58 -- setup/devices.sh@196 -- # declare -a blocks 00:04:48.267 07:29:58 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:48.267 07:29:58 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:48.267 07:29:58 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:48.267 07:29:58 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:48.267 07:29:58 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:48.267 07:29:58 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:48.267 07:29:58 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:48.267 07:29:58 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:48.267 07:29:58 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:48.267 07:29:58 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:48.267 07:29:58 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:48.267 No valid GPT data, bailing 00:04:48.267 07:29:58 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:48.267 07:29:58 -- scripts/common.sh@393 -- # pt= 00:04:48.267 07:29:58 -- scripts/common.sh@394 -- # return 1 00:04:48.267 07:29:58 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:48.267 07:29:58 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:48.267 07:29:58 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:48.267 07:29:58 -- setup/common.sh@80 -- # echo 1600321314816 00:04:48.267 07:29:58 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:48.267 07:29:58 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:48.267 07:29:58 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:48.267 07:29:58 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:48.267 07:29:58 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:48.267 07:29:58 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:48.267 07:29:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:48.267 07:29:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:48.267 07:29:58 -- common/autotest_common.sh@10 -- # set +x 00:04:48.267 ************************************ 00:04:48.267 START TEST nvme_mount 00:04:48.267 ************************************ 00:04:48.267 07:29:58 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:48.267 07:29:58 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:48.267 07:29:58 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:48.267 07:29:58 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:48.267 07:29:58 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:48.267 07:29:58 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:48.267 07:29:58 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:48.267 07:29:58 -- setup/common.sh@40 -- # local part_no=1 00:04:48.267 07:29:58 -- setup/common.sh@41 -- # 
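Before mounting anything, devices.sh builds its candidate list as traced above: zoned namespaces are skipped, spdk-gpt.py and blkid decide whether a disk already carries a partition table ("No valid GPT data, bailing" here), and only disks of at least min_disk_size (3 GiB) survive; this nvme0n1 at 1600321314816 bytes qualifies. A sketch with helper names following the trace and simplified bodies:

```bash
shopt -s nullglob
min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472, as in the trace

is_block_zoned() {
    local device=$1
    [[ -e /sys/block/$device/queue/zoned ]] || return 1
    [[ $(< "/sys/block/$device/queue/zoned") != none ]]
}

block_in_use() {
    # blkid prints a PTTYPE (e.g. gpt) only when a partition table exists.
    [[ -n $(blkid -s PTTYPE -o value "/dev/$1" 2>/dev/null) ]]
}

sec_size_to_bytes() {
    # /sys/block/*/size counts 512-byte sectors regardless of LBA size.
    echo $(( $(< "/sys/block/$1/size") * 512 ))
}

blocks=()
for dev in /sys/block/nvme*n*; do
    dev=${dev##*/}
    is_block_zoned "$dev" && continue           # skip zoned namespaces
    block_in_use "$dev" && continue             # skip already-partitioned disks
    (( $(sec_size_to_bytes "$dev") >= min_disk_size )) && blocks+=("$dev")
done
printf 'candidate: %s\n' "${blocks[@]}"   # nvme0n1 (1600321314816 B) here
```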
local size=1073741824 00:04:48.267 07:29:58 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:48.267 07:29:58 -- setup/common.sh@44 -- # parts=() 00:04:48.267 07:29:58 -- setup/common.sh@44 -- # local parts 00:04:48.267 07:29:58 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:48.267 07:29:58 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:48.267 07:29:58 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:48.267 07:29:58 -- setup/common.sh@46 -- # (( part++ )) 00:04:48.267 07:29:58 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:48.267 07:29:58 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:48.267 07:29:58 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:48.267 07:29:58 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:49.206 Creating new GPT entries in memory. 00:04:49.206 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:49.206 other utilities. 00:04:49.206 07:29:59 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:49.206 07:29:59 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:49.206 07:29:59 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:49.206 07:29:59 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:49.206 07:29:59 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:50.144 Creating new GPT entries in memory. 00:04:50.144 The operation has completed successfully. 00:04:50.144 07:30:00 -- setup/common.sh@57 -- # (( part++ )) 00:04:50.144 07:30:00 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:50.144 07:30:00 -- setup/common.sh@62 -- # wait 1626951 00:04:50.144 07:30:00 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.144 07:30:00 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:50.144 07:30:00 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.144 07:30:00 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:50.144 07:30:00 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:50.144 07:30:00 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.144 07:30:00 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:50.144 07:30:00 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:50.144 07:30:00 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:50.144 07:30:00 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.144 07:30:00 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:50.144 07:30:00 -- setup/devices.sh@53 -- # local found=0 00:04:50.144 07:30:00 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:50.144 07:30:00 -- setup/devices.sh@56 -- # : 00:04:50.144 07:30:00 -- setup/devices.sh@59 -- # local pci status 00:04:50.144 07:30:00 -- 
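The mkfs step traced above creates the mount point, formats with mkfs.ext4 -qF, and mounts; later the same helper reformats the whole disk with a 1024M size cap. A minimal sketch, with an illustrative local mount point in place of the workspace-length path in the log:

```bash
mkfs_and_mount() {
    local dev=$1 mount=$2 size=$3   # size is optional (e.g. 1024M)
    [[ -e $dev ]] || return 1
    mkdir -p "$mount"
    # -q: quiet, -F: force even though the target looks in use.
    mkfs.ext4 -qF "$dev" ${size:+"$size"}
    mount "$dev" "$mount"
}

# As in the log: first the 1 GiB partition; the whole-disk pass later
# adds a 1024M cap as the trailing mkfs.ext4 argument.
mkfs_and_mount /dev/nvme0n1p1 /tmp/nvme_mount
```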
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.144 07:30:00 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:50.144 07:30:00 -- setup/devices.sh@47 -- # setup output config 00:04:50.144 07:30:00 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.144 07:30:00 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:53.436 07:30:04 -- setup/devices.sh@63 -- # found=1 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.436 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.436 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.437 07:30:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.437 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.696 07:30:04 -- 
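The long run of [[ ... == \0\0\0\0\:\d\8\:\0\0\.\0 ]] comparisons above is the verify loop: with PCI_ALLOWED pinned to one BDF, it reads `setup.sh config` output line by line as "BDF (vendor device): message" and flags the target when the message reports it as an active device that was left on its kernel driver. A sketch of that loop; SETUP_SH is a placeholder for the workspace path shown in the log:

```bash
SETUP_SH=./scripts/setup.sh      # placeholder; full path is in the log
dev=0000:d8:00.0
found=0

# Each config line looks like:
#   0000:d8:00.0 (8086 0a54): Active devices: mount@nvme0n1:nvme0n1p1, ...
# so `read -r pci _ _ status` leaves the message in $status.
while read -r pci _ _ status; do
    [[ $pci == "$dev" ]] || continue
    [[ $status == *"Active devices:"*nvme0n1* ]] && found=1
done < <(PCI_ALLOWED=$dev "$SETUP_SH" config)

(( found == 1 )) && echo "$dev is busy and was left on its kernel driver"
```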
setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.696 07:30:04 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:53.696 07:30:04 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.696 07:30:04 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:53.696 07:30:04 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.696 07:30:04 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:53.696 07:30:04 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.696 07:30:04 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.696 07:30:04 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.697 07:30:04 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:53.697 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:53.697 07:30:04 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.697 07:30:04 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.956 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:53.956 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:53.956 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:53.956 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:53.956 07:30:04 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:53.956 07:30:04 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:53.956 07:30:04 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.956 07:30:04 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:53.956 07:30:04 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:53.956 07:30:04 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.956 07:30:04 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.956 07:30:04 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:53.956 07:30:04 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:53.956 07:30:04 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.956 07:30:04 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:53.956 07:30:04 -- setup/devices.sh@53 -- # local found=0 00:04:53.956 07:30:04 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:53.956 07:30:04 -- setup/devices.sh@56 -- # : 00:04:53.956 07:30:04 -- setup/devices.sh@59 -- # local pci status 00:04:53.956 07:30:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.956 07:30:04 -- setup/devices.sh@47 -- # 
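Teardown (cleanup_nvme) as traced above: unmount the test mount if present, then wipefs --all the partition and the whole disk, which is what prints the "bytes were erased" lines (the ext4 magic at 0x438, both GPT headers, and the protective MBR). A minimal sketch with illustrative paths:

```bash
cleanup_nvme() {
    local mount=$1 disk=$2
    mountpoint -q "$mount" && umount "$mount"
    [[ -b ${disk}p1 ]] && wipefs --all "${disk}p1"   # ext4 magic at 0x438
    [[ -b $disk ]] && wipefs --all "$disk"           # GPT x2 + protective MBR
    return 0
}

cleanup_nvme /tmp/nvme_mount /dev/nvme0n1   # illustrative paths
```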
PCI_ALLOWED=0000:d8:00.0 00:04:53.956 07:30:04 -- setup/devices.sh@47 -- # setup output config 00:04:53.956 07:30:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:53.956 07:30:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:57.252 07:30:07 -- setup/devices.sh@63 -- # found=1 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:57.252 07:30:07 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:57.252 07:30:07 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.252 07:30:07 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:57.252 07:30:07 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:57.252 07:30:07 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.252 07:30:07 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:57.252 07:30:07 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:57.252 07:30:07 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:57.252 07:30:07 -- setup/devices.sh@50 -- # local mount_point= 00:04:57.252 07:30:07 -- setup/devices.sh@51 -- # local test_file= 00:04:57.252 07:30:07 -- setup/devices.sh@53 -- # local found=0 00:04:57.252 07:30:07 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:57.252 07:30:07 -- setup/devices.sh@59 -- # local pci status 00:04:57.252 07:30:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.252 07:30:07 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:57.252 07:30:07 -- setup/devices.sh@47 -- # setup output config 00:04:57.252 07:30:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:57.252 07:30:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:00.545 07:30:11 -- setup/devices.sh@63 -- # found=1 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 
07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.545 07:30:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.545 07:30:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.804 07:30:11 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:00.804 07:30:11 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:00.804 07:30:11 -- setup/devices.sh@68 -- # return 0 00:05:00.804 07:30:11 -- setup/devices.sh@128 -- # cleanup_nvme 00:05:00.804 07:30:11 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:00.805 07:30:11 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:00.805 07:30:11 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:00.805 07:30:11 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:00.805 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:00.805 00:05:00.805 real 0m12.632s 00:05:00.805 user 0m3.749s 00:05:00.805 sys 0m6.845s 00:05:00.805 07:30:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.805 07:30:11 -- common/autotest_common.sh@10 -- # set +x 00:05:00.805 ************************************ 00:05:00.805 END TEST nvme_mount 00:05:00.805 ************************************ 00:05:00.805 07:30:11 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:00.805 07:30:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:00.805 07:30:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:00.805 07:30:11 -- common/autotest_common.sh@10 -- # set +x 00:05:00.805 ************************************ 00:05:00.805 START TEST dm_mount 00:05:00.805 ************************************ 00:05:00.805 07:30:11 -- common/autotest_common.sh@1114 -- # dm_mount 00:05:00.805 07:30:11 -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:00.805 07:30:11 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:00.805 07:30:11 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:00.805 07:30:11 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:00.805 07:30:11 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:00.805 07:30:11 -- setup/common.sh@40 -- # local part_no=2 00:05:00.805 07:30:11 -- setup/common.sh@41 -- # local size=1073741824 00:05:00.805 07:30:11 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:00.805 07:30:11 -- setup/common.sh@44 -- # parts=() 00:05:00.805 07:30:11 -- setup/common.sh@44 -- # local parts 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part <= part_no )) 
00:05:00.805 07:30:11 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part++ )) 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.805 07:30:11 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part++ )) 00:05:00.805 07:30:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:00.805 07:30:11 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:00.805 07:30:11 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:00.805 07:30:11 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:01.742 Creating new GPT entries in memory. 00:05:01.742 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:01.742 other utilities. 00:05:01.742 07:30:12 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:01.742 07:30:12 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.742 07:30:12 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:01.742 07:30:12 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:01.742 07:30:12 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:02.680 Creating new GPT entries in memory. 00:05:02.680 The operation has completed successfully. 00:05:02.680 07:30:13 -- setup/common.sh@57 -- # (( part++ )) 00:05:02.680 07:30:13 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:02.680 07:30:13 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:02.680 07:30:13 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:02.680 07:30:13 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:04.061 The operation has completed successfully. 
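For the dm test the disk is re-partitioned with two 1 GiB partitions; the trace above shows the part_start/part_end bookkeeping and the flock-serialized sgdisk calls (sectors 2048-2099199, then 2099200-4196351). A sketch of that loop; the uevent synchronization done by sync_dev_uevents.sh is omitted here:

```bash
partition_drive() {
    local disk=$1 part_no=${2:-2}
    local size=$((1073741824 / 512))        # 1 GiB in 512-byte sectors
    local part part_start=0 part_end=0

    sgdisk "/dev/$disk" --zap-all

    for (( part = 1; part <= part_no; part++ )); do
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
        (( part_end = part_start + size - 1 ))
        # flock serializes concurrent sgdisk calls on the same disk node.
        flock "/dev/$disk" sgdisk "/dev/$disk" \
            --new="$part:$part_start:$part_end"
    done
}

partition_drive nvme0n1 2   # yields 2048-2099199 and 2099200-4196351
```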
00:05:04.061 07:30:14 -- setup/common.sh@57 -- # (( part++ )) 00:05:04.061 07:30:14 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:04.061 07:30:14 -- setup/common.sh@62 -- # wait 1632033 00:05:04.061 07:30:14 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:04.061 07:30:14 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.061 07:30:14 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.061 07:30:14 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:04.061 07:30:14 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:04.061 07:30:14 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.061 07:30:14 -- setup/devices.sh@161 -- # break 00:05:04.061 07:30:14 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.061 07:30:14 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:04.061 07:30:14 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:04.061 07:30:14 -- setup/devices.sh@166 -- # dm=dm-0 00:05:04.061 07:30:14 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:04.061 07:30:14 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:04.061 07:30:14 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.061 07:30:14 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:04.061 07:30:14 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.061 07:30:14 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:04.061 07:30:14 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:04.061 07:30:14 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.061 07:30:14 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.061 07:30:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:04.061 07:30:14 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:04.061 07:30:14 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.061 07:30:14 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.061 07:30:14 -- setup/devices.sh@53 -- # local found=0 00:05:04.061 07:30:14 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:04.061 07:30:14 -- setup/devices.sh@56 -- # : 00:05:04.061 07:30:14 -- setup/devices.sh@59 -- # local pci status 00:05:04.061 07:30:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.061 07:30:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:04.061 07:30:14 -- setup/devices.sh@47 -- # setup output config 00:05:04.061 07:30:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.061 07:30:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:06.594 07:30:17 -- 
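The device-mapper leg traced above: dmsetup create nvme_dm_test over the two partitions, resolve the /dev/mapper symlink to its dm-N node with readlink, and confirm both partitions list that node as a holder. The table passed to dmsetup is not shown in the trace, so the linear concatenation below is an assumption:

```bash
pv0=nvme0n1p1 pv1=nvme0n1p2
sectors=$(blockdev --getsz "/dev/$pv0")   # 2097152 for a 1 GiB partition

# Assumed table: the two partitions concatenated with dm-linear.
dmsetup create nvme_dm_test <<EOF
0 $sectors linear /dev/$pv0 0
$sectors $sectors linear /dev/$pv1 0
EOF

dm=$(readlink -f /dev/mapper/nvme_dm_test)   # e.g. /dev/dm-0, as logged
dm=${dm##*/}

# Both backing partitions now list the dm node as a holder.
[[ -e /sys/class/block/$pv0/holders/$dm ]] && echo "$pv0 held by $dm"
[[ -e /sys/class/block/$pv1/holders/$dm ]] && echo "$pv1 held by $dm"
```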
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:06.594 07:30:17 -- setup/devices.sh@63 -- # found=1 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.594 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.594 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.853 07:30:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:06.853 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.853 07:30:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:06.853 07:30:17 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:06.853 07:30:17 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:06.853 07:30:17 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:06.853 07:30:17 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:06.853 07:30:17 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:06.853 07:30:17 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:06.853 07:30:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:06.853 07:30:17 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:06.853 07:30:17 -- setup/devices.sh@50 -- # local mount_point= 00:05:06.853 07:30:17 -- setup/devices.sh@51 -- # local test_file= 00:05:06.853 07:30:17 -- setup/devices.sh@53 -- # local found=0 00:05:06.853 07:30:17 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:06.853 07:30:17 -- setup/devices.sh@59 -- # local pci status 00:05:06.853 07:30:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.853 07:30:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:06.853 07:30:17 -- setup/devices.sh@47 -- # setup output config 00:05:06.853 07:30:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.853 07:30:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:10.147 07:30:20 -- setup/devices.sh@63 -- # found=1 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.147 07:30:20 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:10.147 07:30:20 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:10.147 07:30:20 -- setup/devices.sh@68 -- # return 0 00:05:10.147 07:30:20 -- setup/devices.sh@187 -- # cleanup_dm 00:05:10.147 07:30:20 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:10.147 07:30:20 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:10.147 07:30:20 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:10.147 07:30:20 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:10.147 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:10.147 07:30:20 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:10.147 00:05:10.147 real 0m9.359s 00:05:10.147 user 0m2.110s 00:05:10.147 sys 0m4.085s 00:05:10.147 07:30:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:10.147 07:30:20 -- common/autotest_common.sh@10 -- # set +x 00:05:10.147 ************************************ 00:05:10.147 END TEST dm_mount 00:05:10.147 ************************************ 00:05:10.147 07:30:20 -- setup/devices.sh@1 -- # cleanup 00:05:10.147 07:30:20 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:10.147 07:30:20 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.147 07:30:20 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:10.147 07:30:20 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:10.147 07:30:20 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:10.407 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:10.407 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:10.407 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:10.407 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:10.407 07:30:21 -- setup/devices.sh@12 -- # cleanup_dm 00:05:10.407 07:30:21 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:10.407 07:30:21 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:05:10.407 07:30:21 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:10.407 07:30:21 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:10.407 07:30:21 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:10.407 07:30:21 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:10.407 00:05:10.407 real 0m26.234s 00:05:10.407 user 0m7.261s 00:05:10.407 sys 0m13.689s 00:05:10.407 07:30:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:10.407 07:30:21 -- common/autotest_common.sh@10 -- # set +x 00:05:10.407 ************************************ 00:05:10.407 END TEST devices 00:05:10.407 ************************************ 00:05:10.407 00:05:10.407 real 1m31.271s 00:05:10.407 user 0m27.887s 00:05:10.407 sys 0m51.844s 00:05:10.407 07:30:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:10.407 07:30:21 -- common/autotest_common.sh@10 -- # set +x 00:05:10.407 ************************************ 00:05:10.407 END TEST setup.sh 00:05:10.407 ************************************ 00:05:10.407 07:30:21 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:13.702 Hugepages 00:05:13.702 node hugesize free / total 00:05:13.702 node0 1048576kB 0 / 0 00:05:13.702 node0 2048kB 2048 / 2048 00:05:13.702 node1 1048576kB 0 / 0 00:05:13.702 node1 2048kB 0 / 0 00:05:13.702 00:05:13.702 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:13.702 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:13.702 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:13.702 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:13.702 07:30:24 -- spdk/autotest.sh@128 -- # uname -s 00:05:13.702 07:30:24 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:13.702 07:30:24 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:13.702 07:30:24 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:16.994 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 
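The "ioatdma -> vfio-pci" and "nvme -> vfio-pci" lines above and just below are setup.sh moving I/OAT and NVMe devices between their kernel drivers and vfio-pci. One common sysfs mechanism for such a rebind is sketched here; it is a generic technique, not necessarily the exact code path setup.sh takes:

```bash
rebind_to() {
    local bdf=$1 target=$2
    local dev=/sys/bus/pci/devices/$bdf
    # Detach from the current driver, if any.
    [[ -e $dev/driver ]] && echo "$bdf" > "$dev/driver/unbind"
    echo "$target" > "$dev/driver_override"   # pin the desired driver
    echo "$bdf" > /sys/bus/pci/drivers_probe  # ask the kernel to re-probe
    echo "" > "$dev/driver_override"          # clear so later rebinds work
}

rebind_to 0000:d8:00.0 vfio-pci   # the NVMe BDF from the log
```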
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.994 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.374 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:18.634 07:30:29 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:19.573 07:30:30 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:19.573 07:30:30 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:19.573 07:30:30 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:19.573 07:30:30 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:19.573 07:30:30 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:19.573 07:30:30 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:19.573 07:30:30 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:19.574 07:30:30 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:19.574 07:30:30 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:19.574 07:30:30 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:19.574 07:30:30 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:19.574 07:30:30 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:22.863 Waiting for block devices as requested 00:05:22.863 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:23.121 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:23.121 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:23.121 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.121 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.380 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:23.380 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:23.380 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:23.639 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:23.639 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:23.639 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:23.899 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.899 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.899 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:24.157 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:24.157 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:24.157 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:24.416 07:30:35 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:24.416 07:30:35 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:24.416 07:30:35 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:24.416 07:30:35 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:24.416 07:30:35 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:24.416 
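get_nvme_bdfs above enumerates controllers via gen_nvme.sh piped through jq -r '.config[].params.traddr'; get_nvme_ctrlr_from_bdf then maps a BDF back to its character device by resolving the /sys/class/nvme symlinks, as in the readlink/grep/basename trace. A sketch of that mapping, written as a loop rather than the traced grep:

```bash
get_nvme_ctrlr_from_bdf() {
    local bdf=$1 link path
    for link in /sys/class/nvme/nvme*; do
        path=$(readlink -f "$link")
        # e.g. /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0
        if [[ $path == *"/$bdf/nvme/"* ]]; then
            printf '/dev/%s\n' "$(basename "$path")"
            return 0
        fi
    done
    return 1
}

get_nvme_ctrlr_from_bdf 0000:d8:00.0   # -> /dev/nvme0 on this box
```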
07:30:35 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:24.416 07:30:35 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:24.416 07:30:35 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:24.416 07:30:35 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:24.416 07:30:35 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:24.416 07:30:35 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:24.416 07:30:35 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:24.416 07:30:35 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:24.416 07:30:35 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:24.416 07:30:35 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:24.416 07:30:35 -- common/autotest_common.sh@1552 -- # continue 00:05:24.416 07:30:35 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:24.416 07:30:35 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:24.416 07:30:35 -- common/autotest_common.sh@10 -- # set +x 00:05:24.416 07:30:35 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:24.416 07:30:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:24.416 07:30:35 -- common/autotest_common.sh@10 -- # set +x 00:05:24.416 07:30:35 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:28.610 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:28.610 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:29.992 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:29.992 07:30:40 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:29.992 07:30:40 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:29.992 07:30:40 -- common/autotest_common.sh@10 -- # set +x 00:05:29.992 07:30:40 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:29.992 07:30:40 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:29.992 07:30:40 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:29.992 07:30:40 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:29.992 07:30:40 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:29.992 07:30:40 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:29.992 07:30:40 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:29.992 07:30:40 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:29.992 07:30:40 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:29.992 07:30:40 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:29.992 07:30:40 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:29.992 07:30:40 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:29.992 07:30:40 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:29.992 07:30:40 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:29.992 07:30:40 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:29.992 07:30:40 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:29.992 07:30:40 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:29.992 07:30:40 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:29.992 07:30:40 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:29.992 07:30:40 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:29.992 07:30:40 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=1641942 00:05:29.992 07:30:40 -- common/autotest_common.sh@1593 -- # waitforlisten 1641942 00:05:29.992 07:30:40 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:29.992 07:30:40 -- common/autotest_common.sh@829 -- # '[' -z 1641942 ']' 00:05:29.992 07:30:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:29.992 07:30:40 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:29.992 07:30:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:29.992 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:29.992 07:30:40 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:29.992 07:30:40 -- common/autotest_common.sh@10 -- # set +x 00:05:29.992 [2024-11-28 07:30:40.604590] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
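The xtrace above shows how opal_revert_cleanup picks its targets: get_nvme_bdfs_by_id walks every discovered NVMe BDF and keeps the ones whose sysfs PCI device ID matches 0x0a54. A standalone sketch of that filter (the bdfs array and the 0x0a54 constant come from the trace; the loop body is illustrative, not the autotest code verbatim):

    # Keep only BDFs whose PCI device ID is 0x0a54, as traced above.
    # Assumes bdfs[] was already filled in (the harness uses
    # scripts/gen_nvme.sh | jq -r '.config[].params.traddr' for that).
    target=0x0a54; matches=()
    for bdf in "${bdfs[@]}"; do
        [[ "$(cat /sys/bus/pci/devices/$bdf/device)" == "$target" ]] && matches+=("$bdf")
    done
    printf '%s\n' "${matches[@]}"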
00:05:29.992 [2024-11-28 07:30:40.604649] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641942 ]
00:05:29.992 EAL: No free 2048 kB hugepages reported on node 1
00:05:29.992 [2024-11-28 07:30:40.669508] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:29.992 [2024-11-28 07:30:40.709428] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:29.992 [2024-11-28 07:30:40.709544] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:30.929 07:30:41 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:30.929 07:30:41 -- common/autotest_common.sh@862 -- # return 0
00:05:30.929 07:30:41 -- common/autotest_common.sh@1595 -- # bdf_id=0
00:05:30.929 07:30:41 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}"
00:05:30.929 07:30:41 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
00:05:34.224 nvme0n1
00:05:34.224 07:30:44 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:34.224 [2024-11-28 07:30:44.610687] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal
00:05:34.224 request:
00:05:34.224 {
00:05:34.224 "nvme_ctrlr_name": "nvme0",
00:05:34.224 "password": "test",
00:05:34.224 "method": "bdev_nvme_opal_revert",
00:05:34.224 "req_id": 1
00:05:34.224 }
00:05:34.224 Got JSON-RPC error response
00:05:34.224 response:
00:05:34.224 {
00:05:34.224 "code": -32602,
00:05:34.224 "message": "Invalid parameters"
00:05:34.224 }
00:05:34.224 07:30:44 -- common/autotest_common.sh@1599 -- # true
00:05:34.224 07:30:44 -- common/autotest_common.sh@1600 -- # (( ++bdf_id ))
00:05:34.224 07:30:44 -- common/autotest_common.sh@1603 -- # killprocess 1641942
00:05:34.224 07:30:44 -- common/autotest_common.sh@936 -- # '[' -z 1641942 ']'
00:05:34.224 07:30:44 -- common/autotest_common.sh@940 -- # kill -0 1641942
00:05:34.224 07:30:44 -- common/autotest_common.sh@941 -- # uname
00:05:34.224 07:30:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:34.224 07:30:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1641942
00:05:34.224 07:30:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:34.224 07:30:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:34.224 07:30:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1641942'
killing process with pid 1641942
00:05:34.224 07:30:44 -- common/autotest_common.sh@955 -- # kill 1641942
00:05:34.224 07:30:44 -- common/autotest_common.sh@960 -- # wait 1641942
00:05:34.225 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
00:05:36.131 07:30:46 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']'
00:05:36.131 07:30:46 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']'
00:05:36.131 07:30:46 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]]
00:05:36.131 07:30:46 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]]
00:05:36.131 07:30:46 -- spdk/autotest.sh@160 -- # timing_enter lib
00:05:36.131 07:30:46 -- common/autotest_common.sh@722 -- # xtrace_disable
00:05:36.131 07:30:46 -- common/autotest_common.sh@10 -- # set +x
00:05:36.131 07:30:46 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:36.131 07:30:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:36.131 07:30:46 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:36.131 07:30:46 -- common/autotest_common.sh@10 -- # set +x
00:05:36.131 ************************************
00:05:36.131 START TEST env
00:05:36.131 ************************************
00:05:36.131 07:30:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:36.392 * Looking for test storage...
00:05:36.392 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env
00:05:36.392 07:30:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:36.392 07:30:46 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:36.392 07:30:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:36.392 07:30:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:36.392 07:30:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:36.392 07:30:47 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:36.392 07:30:47 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:36.392 07:30:47 -- scripts/common.sh@335 -- # IFS=.-:
00:05:36.392 07:30:47 -- scripts/common.sh@335 -- # read -ra ver1
00:05:36.392 07:30:47 -- scripts/common.sh@336 -- # IFS=.-:
00:05:36.392 07:30:47 -- scripts/common.sh@336 -- # read -ra ver2
00:05:36.392 07:30:47 -- scripts/common.sh@337 -- # local 'op=<'
00:05:36.392 07:30:47 -- scripts/common.sh@339 -- # ver1_l=2
00:05:36.392 07:30:47 -- scripts/common.sh@340 -- # ver2_l=1
00:05:36.392 07:30:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:36.392 07:30:47 -- scripts/common.sh@343 -- # case "$op" in
00:05:36.392 07:30:47 -- scripts/common.sh@344 -- # : 1
00:05:36.392 07:30:47 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:36.392 07:30:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ?
ver1_l : ver2_l) )) 00:05:36.392 07:30:47 -- scripts/common.sh@364 -- # decimal 1 00:05:36.392 07:30:47 -- scripts/common.sh@352 -- # local d=1 00:05:36.392 07:30:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:36.392 07:30:47 -- scripts/common.sh@354 -- # echo 1 00:05:36.392 07:30:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:36.392 07:30:47 -- scripts/common.sh@365 -- # decimal 2 00:05:36.392 07:30:47 -- scripts/common.sh@352 -- # local d=2 00:05:36.392 07:30:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:36.392 07:30:47 -- scripts/common.sh@354 -- # echo 2 00:05:36.392 07:30:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:36.392 07:30:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:36.392 07:30:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:36.392 07:30:47 -- scripts/common.sh@367 -- # return 0 00:05:36.392 07:30:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:36.392 07:30:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:36.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:36.392 --rc genhtml_branch_coverage=1 00:05:36.392 --rc genhtml_function_coverage=1 00:05:36.392 --rc genhtml_legend=1 00:05:36.392 --rc geninfo_all_blocks=1 00:05:36.392 --rc geninfo_unexecuted_blocks=1 00:05:36.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:36.392 ' 00:05:36.392 07:30:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:36.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:36.392 --rc genhtml_branch_coverage=1 00:05:36.392 --rc genhtml_function_coverage=1 00:05:36.392 --rc genhtml_legend=1 00:05:36.392 --rc geninfo_all_blocks=1 00:05:36.392 --rc geninfo_unexecuted_blocks=1 00:05:36.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:36.392 ' 00:05:36.392 07:30:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:36.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:36.392 --rc genhtml_branch_coverage=1 00:05:36.392 --rc genhtml_function_coverage=1 00:05:36.392 --rc genhtml_legend=1 00:05:36.392 --rc geninfo_all_blocks=1 00:05:36.392 --rc geninfo_unexecuted_blocks=1 00:05:36.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:36.392 ' 00:05:36.392 07:30:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:36.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:36.392 --rc genhtml_branch_coverage=1 00:05:36.392 --rc genhtml_function_coverage=1 00:05:36.392 --rc genhtml_legend=1 00:05:36.392 --rc geninfo_all_blocks=1 00:05:36.392 --rc geninfo_unexecuted_blocks=1 00:05:36.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:36.392 ' 00:05:36.392 07:30:47 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:36.392 07:30:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:36.392 07:30:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:36.392 07:30:47 -- common/autotest_common.sh@10 -- # set +x 00:05:36.392 ************************************ 00:05:36.392 START TEST env_memory 00:05:36.392 ************************************ 00:05:36.392 07:30:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
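The lt/cmp_versions xtrace above (used here to gate lcov options on the installed lcov version) splits both version strings on '.', '-' and ':' and compares them field by field. A condensed sketch of the same idea, assuming purely numeric fields (the real scripts/common.sh helper also handles '>', '==' and other corner cases):

    # Return success when version $1 sorts strictly before version $2.
    version_lt() {
        local IFS='.-:'
        local -a a b
        read -ra a <<< "$1"; read -ra b <<< "$2"
        local v
        for ((v = 0; v < ${#a[@]} || v < ${#b[@]}; v++)); do
            (( ${a[v]:-0} < ${b[v]:-0} )) && return 0   # first lower field wins
            (( ${a[v]:-0} > ${b[v]:-0} )) && return 1
        done
        return 1   # versions are equal
    }
    version_lt 1.15 2 && echo "1.15 < 2"   # matches the traced result above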
00:05:36.392 00:05:36.392 00:05:36.392 CUnit - A unit testing framework for C - Version 2.1-3 00:05:36.392 http://cunit.sourceforge.net/ 00:05:36.392 00:05:36.392 00:05:36.392 Suite: memory 00:05:36.392 Test: alloc and free memory map ...[2024-11-28 07:30:47.122378] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:36.392 passed 00:05:36.392 Test: mem map translation ...[2024-11-28 07:30:47.135047] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:36.392 [2024-11-28 07:30:47.135074] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:36.392 [2024-11-28 07:30:47.135103] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:36.392 [2024-11-28 07:30:47.135111] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:36.392 passed 00:05:36.392 Test: mem map registration ...[2024-11-28 07:30:47.154522] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:36.392 [2024-11-28 07:30:47.154540] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:36.392 passed 00:05:36.653 Test: mem map adjacent registrations ...passed 00:05:36.653 00:05:36.653 Run Summary: Type Total Ran Passed Failed Inactive 00:05:36.653 suites 1 1 n/a 0 0 00:05:36.653 tests 4 4 4 0 0 00:05:36.653 asserts 152 152 152 0 n/a 00:05:36.653 00:05:36.653 Elapsed time = 0.082 seconds 00:05:36.653 00:05:36.653 real 0m0.095s 00:05:36.653 user 0m0.081s 00:05:36.653 sys 0m0.014s 00:05:36.653 07:30:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:36.653 07:30:47 -- common/autotest_common.sh@10 -- # set +x 00:05:36.653 ************************************ 00:05:36.653 END TEST env_memory 00:05:36.653 ************************************ 00:05:36.653 07:30:47 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:36.653 07:30:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:36.653 07:30:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:36.653 07:30:47 -- common/autotest_common.sh@10 -- # set +x 00:05:36.653 ************************************ 00:05:36.653 START TEST env_vtophys 00:05:36.653 ************************************ 00:05:36.653 07:30:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:36.653 EAL: lib.eal log level changed from notice to debug 00:05:36.653 EAL: Detected lcore 0 as core 0 on socket 0 00:05:36.653 EAL: Detected lcore 1 as core 1 on socket 0 00:05:36.653 EAL: Detected lcore 2 as core 2 on socket 0 00:05:36.653 EAL: Detected lcore 3 as core 3 on socket 0 00:05:36.653 EAL: Detected lcore 4 as core 4 on socket 0 00:05:36.653 EAL: Detected lcore 5 as core 5 on socket 0 00:05:36.653 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:36.653 EAL: Detected lcore 7 as core 8 on socket 0 00:05:36.653 EAL: Detected lcore 8 as core 9 on socket 0 00:05:36.653 EAL: Detected lcore 9 as core 10 on socket 0 00:05:36.653 EAL: Detected lcore 10 as core 11 on socket 0 00:05:36.653 EAL: Detected lcore 11 as core 12 on socket 0 00:05:36.653 EAL: Detected lcore 12 as core 13 on socket 0 00:05:36.653 EAL: Detected lcore 13 as core 14 on socket 0 00:05:36.653 EAL: Detected lcore 14 as core 16 on socket 0 00:05:36.653 EAL: Detected lcore 15 as core 17 on socket 0 00:05:36.653 EAL: Detected lcore 16 as core 18 on socket 0 00:05:36.653 EAL: Detected lcore 17 as core 19 on socket 0 00:05:36.653 EAL: Detected lcore 18 as core 20 on socket 0 00:05:36.653 EAL: Detected lcore 19 as core 21 on socket 0 00:05:36.653 EAL: Detected lcore 20 as core 22 on socket 0 00:05:36.653 EAL: Detected lcore 21 as core 24 on socket 0 00:05:36.653 EAL: Detected lcore 22 as core 25 on socket 0 00:05:36.653 EAL: Detected lcore 23 as core 26 on socket 0 00:05:36.653 EAL: Detected lcore 24 as core 27 on socket 0 00:05:36.653 EAL: Detected lcore 25 as core 28 on socket 0 00:05:36.653 EAL: Detected lcore 26 as core 29 on socket 0 00:05:36.653 EAL: Detected lcore 27 as core 30 on socket 0 00:05:36.653 EAL: Detected lcore 28 as core 0 on socket 1 00:05:36.653 EAL: Detected lcore 29 as core 1 on socket 1 00:05:36.653 EAL: Detected lcore 30 as core 2 on socket 1 00:05:36.653 EAL: Detected lcore 31 as core 3 on socket 1 00:05:36.653 EAL: Detected lcore 32 as core 4 on socket 1 00:05:36.653 EAL: Detected lcore 33 as core 5 on socket 1 00:05:36.653 EAL: Detected lcore 34 as core 6 on socket 1 00:05:36.653 EAL: Detected lcore 35 as core 8 on socket 1 00:05:36.653 EAL: Detected lcore 36 as core 9 on socket 1 00:05:36.653 EAL: Detected lcore 37 as core 10 on socket 1 00:05:36.653 EAL: Detected lcore 38 as core 11 on socket 1 00:05:36.653 EAL: Detected lcore 39 as core 12 on socket 1 00:05:36.653 EAL: Detected lcore 40 as core 13 on socket 1 00:05:36.653 EAL: Detected lcore 41 as core 14 on socket 1 00:05:36.653 EAL: Detected lcore 42 as core 16 on socket 1 00:05:36.653 EAL: Detected lcore 43 as core 17 on socket 1 00:05:36.653 EAL: Detected lcore 44 as core 18 on socket 1 00:05:36.653 EAL: Detected lcore 45 as core 19 on socket 1 00:05:36.653 EAL: Detected lcore 46 as core 20 on socket 1 00:05:36.653 EAL: Detected lcore 47 as core 21 on socket 1 00:05:36.653 EAL: Detected lcore 48 as core 22 on socket 1 00:05:36.653 EAL: Detected lcore 49 as core 24 on socket 1 00:05:36.653 EAL: Detected lcore 50 as core 25 on socket 1 00:05:36.653 EAL: Detected lcore 51 as core 26 on socket 1 00:05:36.653 EAL: Detected lcore 52 as core 27 on socket 1 00:05:36.653 EAL: Detected lcore 53 as core 28 on socket 1 00:05:36.653 EAL: Detected lcore 54 as core 29 on socket 1 00:05:36.653 EAL: Detected lcore 55 as core 30 on socket 1 00:05:36.653 EAL: Detected lcore 56 as core 0 on socket 0 00:05:36.653 EAL: Detected lcore 57 as core 1 on socket 0 00:05:36.653 EAL: Detected lcore 58 as core 2 on socket 0 00:05:36.653 EAL: Detected lcore 59 as core 3 on socket 0 00:05:36.653 EAL: Detected lcore 60 as core 4 on socket 0 00:05:36.653 EAL: Detected lcore 61 as core 5 on socket 0 00:05:36.653 EAL: Detected lcore 62 as core 6 on socket 0 00:05:36.653 EAL: Detected lcore 63 as core 8 on socket 0 00:05:36.653 EAL: Detected lcore 64 as core 9 on socket 0 00:05:36.653 EAL: Detected lcore 65 as core 10 on socket 0 00:05:36.653 EAL: Detected lcore 66 as core 11 on socket 0 00:05:36.653 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:36.653 EAL: Detected lcore 68 as core 13 on socket 0 00:05:36.653 EAL: Detected lcore 69 as core 14 on socket 0 00:05:36.653 EAL: Detected lcore 70 as core 16 on socket 0 00:05:36.653 EAL: Detected lcore 71 as core 17 on socket 0 00:05:36.653 EAL: Detected lcore 72 as core 18 on socket 0 00:05:36.653 EAL: Detected lcore 73 as core 19 on socket 0 00:05:36.653 EAL: Detected lcore 74 as core 20 on socket 0 00:05:36.653 EAL: Detected lcore 75 as core 21 on socket 0 00:05:36.653 EAL: Detected lcore 76 as core 22 on socket 0 00:05:36.653 EAL: Detected lcore 77 as core 24 on socket 0 00:05:36.653 EAL: Detected lcore 78 as core 25 on socket 0 00:05:36.653 EAL: Detected lcore 79 as core 26 on socket 0 00:05:36.653 EAL: Detected lcore 80 as core 27 on socket 0 00:05:36.653 EAL: Detected lcore 81 as core 28 on socket 0 00:05:36.653 EAL: Detected lcore 82 as core 29 on socket 0 00:05:36.653 EAL: Detected lcore 83 as core 30 on socket 0 00:05:36.653 EAL: Detected lcore 84 as core 0 on socket 1 00:05:36.653 EAL: Detected lcore 85 as core 1 on socket 1 00:05:36.653 EAL: Detected lcore 86 as core 2 on socket 1 00:05:36.653 EAL: Detected lcore 87 as core 3 on socket 1 00:05:36.653 EAL: Detected lcore 88 as core 4 on socket 1 00:05:36.653 EAL: Detected lcore 89 as core 5 on socket 1 00:05:36.653 EAL: Detected lcore 90 as core 6 on socket 1 00:05:36.653 EAL: Detected lcore 91 as core 8 on socket 1 00:05:36.653 EAL: Detected lcore 92 as core 9 on socket 1 00:05:36.653 EAL: Detected lcore 93 as core 10 on socket 1 00:05:36.653 EAL: Detected lcore 94 as core 11 on socket 1 00:05:36.653 EAL: Detected lcore 95 as core 12 on socket 1 00:05:36.653 EAL: Detected lcore 96 as core 13 on socket 1 00:05:36.653 EAL: Detected lcore 97 as core 14 on socket 1 00:05:36.653 EAL: Detected lcore 98 as core 16 on socket 1 00:05:36.653 EAL: Detected lcore 99 as core 17 on socket 1 00:05:36.653 EAL: Detected lcore 100 as core 18 on socket 1 00:05:36.653 EAL: Detected lcore 101 as core 19 on socket 1 00:05:36.653 EAL: Detected lcore 102 as core 20 on socket 1 00:05:36.653 EAL: Detected lcore 103 as core 21 on socket 1 00:05:36.653 EAL: Detected lcore 104 as core 22 on socket 1 00:05:36.653 EAL: Detected lcore 105 as core 24 on socket 1 00:05:36.653 EAL: Detected lcore 106 as core 25 on socket 1 00:05:36.653 EAL: Detected lcore 107 as core 26 on socket 1 00:05:36.653 EAL: Detected lcore 108 as core 27 on socket 1 00:05:36.653 EAL: Detected lcore 109 as core 28 on socket 1 00:05:36.653 EAL: Detected lcore 110 as core 29 on socket 1 00:05:36.653 EAL: Detected lcore 111 as core 30 on socket 1 00:05:36.653 EAL: Maximum logical cores by configuration: 128 00:05:36.653 EAL: Detected CPU lcores: 112 00:05:36.653 EAL: Detected NUMA nodes: 2 00:05:36.653 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:05:36.653 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:36.653 EAL: Checking presence of .so 'librte_eal.so' 00:05:36.653 EAL: Detected static linkage of DPDK 00:05:36.653 EAL: No shared files mode enabled, IPC will be disabled 00:05:36.653 EAL: Bus pci wants IOVA as 'DC' 00:05:36.653 EAL: Buses did not request a specific IOVA mode. 00:05:36.653 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:36.653 EAL: Selected IOVA mode 'VA' 00:05:36.653 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.653 EAL: Probing VFIO support... 
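Because an IOMMU is available, the probe below can settle on VFIO with IOVA-as-VA; on a host without populated IOMMU groups EAL would fall back to no-IOMMU or physical addressing. One quick way to verify that precondition before a run (a generic host check, not something this test performs):

    # An enabled IOMMU shows up as populated groups in sysfs; vfio-pci
    # needs this unless the unsafe no-IOMMU mode is turned on.
    if compgen -G '/sys/kernel/iommu_groups/*' > /dev/null; then
        echo "IOMMU groups present: VFIO with IOVA=VA is usable"
    else
        echo "no IOMMU groups: expect no-IOMMU or IOVA=PA fallback"
    fi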
00:05:36.653 EAL: IOMMU type 1 (Type 1) is supported 00:05:36.653 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:36.653 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:36.654 EAL: VFIO support initialized 00:05:36.654 EAL: Ask a virtual area of 0x2e000 bytes 00:05:36.654 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:36.654 EAL: Setting up physically contiguous memory... 00:05:36.654 EAL: Setting maximum number of open files to 524288 00:05:36.654 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:36.654 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:36.654 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:36.654 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:36.654 EAL: Ask a virtual area of 0x61000 bytes 00:05:36.654 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:36.654 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:36.654 EAL: Ask a virtual area of 0x400000000 bytes 00:05:36.654 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:36.654 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:36.654 EAL: Hugepages will be freed exactly as allocated. 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: TSC frequency is ~2500000 KHz 00:05:36.654 EAL: Main lcore 0 is ready (tid=7f1613c8ea00;cpuset=[0]) 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 0 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 2MB 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Mem event callback 'spdk:(nil)' registered 00:05:36.654 00:05:36.654 00:05:36.654 CUnit - A unit testing framework for C - Version 2.1-3 00:05:36.654 http://cunit.sourceforge.net/ 00:05:36.654 00:05:36.654 00:05:36.654 Suite: components_suite 00:05:36.654 Test: vtophys_malloc_test ...passed 00:05:36.654 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 4MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 4MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 6MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 6MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 10MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 10MB 00:05:36.654 EAL: Trying to obtain current memory policy. 
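The vtophys_spdk_malloc_test rounds above and below expand the heap by 4, 6, 10, 18, 34, 66 and 130 MB, then 258, 514 and 1026 MB, i.e. 2^k + 2 MB per round, so each request roughly doubles. Assuming that pattern is the test's intent (it matches every round shown in this log), the series reproduces as:

    # The expand/shrink sizes observed in this vtophys run:
    for k in $(seq 1 10); do printf '%dMB ' $(( (1 << k) + 2 )); done; echo
    # -> 4MB 6MB 10MB 18MB 34MB 66MB 130MB 258MB 514MB 1026MB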
00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 18MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 18MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 34MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 34MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 66MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 66MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.654 EAL: Restoring previous memory policy: 4 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was expanded by 130MB 00:05:36.654 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.654 EAL: request: mp_malloc_sync 00:05:36.654 EAL: No shared files mode enabled, IPC is disabled 00:05:36.654 EAL: Heap on socket 0 was shrunk by 130MB 00:05:36.654 EAL: Trying to obtain current memory policy. 00:05:36.654 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.914 EAL: Restoring previous memory policy: 4 00:05:36.914 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.914 EAL: request: mp_malloc_sync 00:05:36.914 EAL: No shared files mode enabled, IPC is disabled 00:05:36.914 EAL: Heap on socket 0 was expanded by 258MB 00:05:36.914 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.914 EAL: request: mp_malloc_sync 00:05:36.914 EAL: No shared files mode enabled, IPC is disabled 00:05:36.914 EAL: Heap on socket 0 was shrunk by 258MB 00:05:36.914 EAL: Trying to obtain current memory policy. 
00:05:36.914 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:36.914 EAL: Restoring previous memory policy: 4 00:05:36.914 EAL: Calling mem event callback 'spdk:(nil)' 00:05:36.914 EAL: request: mp_malloc_sync 00:05:36.914 EAL: No shared files mode enabled, IPC is disabled 00:05:36.914 EAL: Heap on socket 0 was expanded by 514MB 00:05:37.174 EAL: Calling mem event callback 'spdk:(nil)' 00:05:37.174 EAL: request: mp_malloc_sync 00:05:37.174 EAL: No shared files mode enabled, IPC is disabled 00:05:37.174 EAL: Heap on socket 0 was shrunk by 514MB 00:05:37.174 EAL: Trying to obtain current memory policy. 00:05:37.174 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:37.468 EAL: Restoring previous memory policy: 4 00:05:37.468 EAL: Calling mem event callback 'spdk:(nil)' 00:05:37.468 EAL: request: mp_malloc_sync 00:05:37.468 EAL: No shared files mode enabled, IPC is disabled 00:05:37.468 EAL: Heap on socket 0 was expanded by 1026MB 00:05:37.468 EAL: Calling mem event callback 'spdk:(nil)' 00:05:37.788 EAL: request: mp_malloc_sync 00:05:37.788 EAL: No shared files mode enabled, IPC is disabled 00:05:37.788 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:37.788 passed 00:05:37.788 00:05:37.788 Run Summary: Type Total Ran Passed Failed Inactive 00:05:37.788 suites 1 1 n/a 0 0 00:05:37.788 tests 2 2 2 0 0 00:05:37.788 asserts 497 497 497 0 n/a 00:05:37.788 00:05:37.788 Elapsed time = 0.954 seconds 00:05:37.788 EAL: Calling mem event callback 'spdk:(nil)' 00:05:37.788 EAL: request: mp_malloc_sync 00:05:37.788 EAL: No shared files mode enabled, IPC is disabled 00:05:37.788 EAL: Heap on socket 0 was shrunk by 2MB 00:05:37.788 EAL: No shared files mode enabled, IPC is disabled 00:05:37.788 EAL: No shared files mode enabled, IPC is disabled 00:05:37.788 EAL: No shared files mode enabled, IPC is disabled 00:05:37.788 00:05:37.788 real 0m1.067s 00:05:37.788 user 0m0.619s 00:05:37.788 sys 0m0.425s 00:05:37.788 07:30:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.788 07:30:48 -- common/autotest_common.sh@10 -- # set +x 00:05:37.788 ************************************ 00:05:37.788 END TEST env_vtophys 00:05:37.788 ************************************ 00:05:37.788 07:30:48 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:37.788 07:30:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:37.788 07:30:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.788 07:30:48 -- common/autotest_common.sh@10 -- # set +x 00:05:37.788 ************************************ 00:05:37.788 START TEST env_pci 00:05:37.788 ************************************ 00:05:37.788 07:30:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:37.788 00:05:37.788 00:05:37.788 CUnit - A unit testing framework for C - Version 2.1-3 00:05:37.788 http://cunit.sourceforge.net/ 00:05:37.788 00:05:37.788 00:05:37.788 Suite: pci 00:05:37.788 Test: pci_hook ...[2024-11-28 07:30:48.354214] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1643427 has claimed it 00:05:37.788 EAL: Cannot find device (10000:00:01.0) 00:05:37.788 EAL: Failed to attach device on primary process 00:05:37.788 passed 00:05:37.788 00:05:37.788 Run Summary: Type Total Ran Passed Failed Inactive 00:05:37.788 suites 1 1 n/a 0 0 00:05:37.788 tests 1 1 1 0 0 
00:05:37.788 asserts 25 25 25 0 n/a 00:05:37.788 00:05:37.788 Elapsed time = 0.039 seconds 00:05:37.788 00:05:37.788 real 0m0.057s 00:05:37.788 user 0m0.012s 00:05:37.788 sys 0m0.045s 00:05:37.788 07:30:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.788 07:30:48 -- common/autotest_common.sh@10 -- # set +x 00:05:37.788 ************************************ 00:05:37.788 END TEST env_pci 00:05:37.788 ************************************ 00:05:37.788 07:30:48 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:37.788 07:30:48 -- env/env.sh@15 -- # uname 00:05:37.788 07:30:48 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:37.788 07:30:48 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:37.788 07:30:48 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:37.788 07:30:48 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:37.788 07:30:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.788 07:30:48 -- common/autotest_common.sh@10 -- # set +x 00:05:37.788 ************************************ 00:05:37.788 START TEST env_dpdk_post_init 00:05:37.788 ************************************ 00:05:37.788 07:30:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:37.788 EAL: Detected CPU lcores: 112 00:05:37.788 EAL: Detected NUMA nodes: 2 00:05:37.788 EAL: Detected static linkage of DPDK 00:05:37.788 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:37.788 EAL: Selected IOVA mode 'VA' 00:05:37.788 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.788 EAL: VFIO support initialized 00:05:37.788 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:38.048 EAL: Using IOMMU type 1 (Type 1) 00:05:38.618 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:42.816 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:42.816 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:42.816 Starting DPDK initialization... 00:05:42.816 Starting SPDK post initialization... 00:05:42.816 SPDK NVMe probe 00:05:42.816 Attaching to 0000:d8:00.0 00:05:42.816 Attached to 0000:d8:00.0 00:05:42.816 Cleaning up... 
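That post-init pass is the same bind/attach/detach cycle the rest of this run drives through scripts/setup.sh; the three invocations seen in this log cover the full device lifecycle (a summary of how they are used here, not setup.sh's complete option set):

    # setup.sh modes as exercised in this run:
    scripts/setup.sh status   # report hugepage pools and current driver bindings
    scripts/setup.sh          # rebind NVMe/I-OAT devices to vfio-pci for SPDK
    scripts/setup.sh reset    # hand devices back to kernel drivers (nvme, ioatdma)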
00:05:42.816
00:05:42.816 real 0m4.719s
00:05:42.816 user 0m3.557s
00:05:42.816 sys 0m0.405s
00:05:42.816 07:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:42.816 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.816 ************************************
00:05:42.816 END TEST env_dpdk_post_init
00:05:42.816 ************************************
00:05:42.816 07:30:53 -- env/env.sh@26 -- # uname
00:05:42.816 07:30:53 -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:05:42.816 07:30:53 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:42.816 07:30:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:42.816 07:30:53 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:42.816 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.816 ************************************
00:05:42.816 START TEST env_mem_callbacks
00:05:42.816 ************************************
00:05:42.816 07:30:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:42.816 EAL: Detected CPU lcores: 112
00:05:42.816 EAL: Detected NUMA nodes: 2
00:05:42.816 EAL: Detected static linkage of DPDK
00:05:42.816 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:42.816 EAL: Selected IOVA mode 'VA'
00:05:42.816 EAL: No free 2048 kB hugepages reported on node 1
00:05:42.816 EAL: VFIO support initialized
00:05:42.816 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:42.816
00:05:42.816
00:05:42.816 CUnit - A unit testing framework for C - Version 2.1-3
00:05:42.816 http://cunit.sourceforge.net/
00:05:42.816
00:05:42.816
00:05:42.816 Suite: memory
00:05:42.816 Test: test ...
00:05:42.816 register 0x200000200000 2097152
00:05:42.816 malloc 3145728
00:05:42.816 register 0x200000400000 4194304
00:05:42.816 buf 0x200000500000 len 3145728 PASSED
00:05:42.816 malloc 64
00:05:42.816 buf 0x2000004fff40 len 64 PASSED
00:05:42.816 malloc 4194304
00:05:42.816 register 0x200000800000 6291456
00:05:42.816 buf 0x200000a00000 len 4194304 PASSED
00:05:42.816 free 0x200000500000 3145728
00:05:42.816 free 0x2000004fff40 64
00:05:42.816 unregister 0x200000400000 4194304 PASSED
00:05:42.816 free 0x200000a00000 4194304
00:05:42.816 unregister 0x200000800000 6291456 PASSED
00:05:42.816 malloc 8388608
00:05:42.816 register 0x200000400000 10485760
00:05:42.816 buf 0x200000600000 len 8388608 PASSED
00:05:42.816 free 0x200000600000 8388608
00:05:42.816 unregister 0x200000400000 10485760 PASSED
00:05:42.816 passed
00:05:42.816
00:05:42.816 Run Summary: Type Total Ran Passed Failed Inactive
00:05:42.816 suites 1 1 n/a 0 0
00:05:42.816 tests 1 1 1 0 0
00:05:42.816 asserts 15 15 15 0 n/a
00:05:42.816
00:05:42.816 Elapsed time = 0.004 seconds
00:05:42.816
00:05:42.816 real 0m0.048s
00:05:42.816 user 0m0.011s
00:05:42.816 sys 0m0.036s
00:05:42.816 07:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:42.816 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.816 ************************************
00:05:42.816 END TEST env_mem_callbacks
00:05:42.816 ************************************
00:05:42.816
00:05:42.816 real 0m6.408s
00:05:42.816 user 0m4.470s
00:05:42.816 sys 0m1.213s
00:05:42.816 07:30:53 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:42.816 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.816 ************************************
00:05:42.816 END TEST env
00:05:42.816 ************************************
00:05:42.816 07:30:53 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:42.816 07:30:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:42.816 07:30:53 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:42.816 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.816 ************************************
00:05:42.816 START TEST rpc
00:05:42.816 ************************************
00:05:42.816 07:30:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:42.816 * Looking for test storage...
00:05:42.816 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:05:42.816 07:30:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:42.816 07:30:53 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:42.816 07:30:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:42.816 07:30:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:42.816 07:30:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:42.816 07:30:53 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:42.816 07:30:53 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:42.816 07:30:53 -- scripts/common.sh@335 -- # IFS=.-:
00:05:42.816 07:30:53 -- scripts/common.sh@335 -- # read -ra ver1
00:05:42.816 07:30:53 -- scripts/common.sh@336 -- # IFS=.-:
00:05:42.816 07:30:53 -- scripts/common.sh@336 -- # read -ra ver2
00:05:42.816 07:30:53 -- scripts/common.sh@337 -- # local 'op=<'
00:05:42.816 07:30:53 -- scripts/common.sh@339 -- # ver1_l=2
00:05:42.816 07:30:53 -- scripts/common.sh@340 -- # ver2_l=1
00:05:42.816 07:30:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:42.816 07:30:53 -- scripts/common.sh@343 -- # case "$op" in
00:05:42.816 07:30:53 -- scripts/common.sh@344 -- # : 1
00:05:42.816 07:30:53 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:42.816 07:30:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:42.816 07:30:53 -- scripts/common.sh@364 -- # decimal 1
00:05:42.816 07:30:53 -- scripts/common.sh@352 -- # local d=1
00:05:42.816 07:30:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:42.816 07:30:53 -- scripts/common.sh@354 -- # echo 1
00:05:42.816 07:30:53 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:42.816 07:30:53 -- scripts/common.sh@365 -- # decimal 2
00:05:42.816 07:30:53 -- scripts/common.sh@352 -- # local d=2
00:05:42.816 07:30:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:42.816 07:30:53 -- scripts/common.sh@354 -- # echo 2
00:05:42.816 07:30:53 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:42.816 07:30:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:42.816 07:30:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:42.816 07:30:53 -- scripts/common.sh@367 -- # return 0
00:05:42.816 07:30:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:42.816 07:30:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:42.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:42.816 --rc genhtml_branch_coverage=1
00:05:42.816 --rc genhtml_function_coverage=1
00:05:42.816 --rc genhtml_legend=1
00:05:42.816 --rc geninfo_all_blocks=1
00:05:42.816 --rc geninfo_unexecuted_blocks=1
00:05:42.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:42.816 '
00:05:42.816 07:30:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:42.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:42.816 --rc genhtml_branch_coverage=1
00:05:42.816 --rc genhtml_function_coverage=1
00:05:42.816 --rc genhtml_legend=1
00:05:42.816 --rc geninfo_all_blocks=1
00:05:42.816 --rc geninfo_unexecuted_blocks=1
00:05:42.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:42.816 '
00:05:42.816 07:30:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:42.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:42.816 --rc genhtml_branch_coverage=1
00:05:42.816 --rc genhtml_function_coverage=1
00:05:42.816 --rc genhtml_legend=1
00:05:42.816 --rc geninfo_all_blocks=1
00:05:42.816 --rc geninfo_unexecuted_blocks=1
00:05:42.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:42.816 '
00:05:42.817 07:30:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:42.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:42.817 --rc genhtml_branch_coverage=1
00:05:42.817 --rc genhtml_function_coverage=1
00:05:42.817 --rc genhtml_legend=1
00:05:42.817 --rc geninfo_all_blocks=1
00:05:42.817 --rc geninfo_unexecuted_blocks=1
00:05:42.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:42.817 '
00:05:42.817 07:30:53 -- rpc/rpc.sh@65 -- # spdk_pid=1644451
00:05:42.817 07:30:53 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:05:42.817 07:30:53 -- rpc/rpc.sh@67 -- # waitforlisten 1644451
00:05:42.817 07:30:53 -- common/autotest_common.sh@829 -- # '[' -z 1644451 ']'
00:05:42.817 07:30:53 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:42.817 07:30:53 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:42.817 07:30:53 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:05:42.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:42.817 07:30:53 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:42.817 07:30:53 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:05:42.817 07:30:53 -- common/autotest_common.sh@10 -- # set +x
00:05:42.817 [2024-11-28 07:30:53.557775] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:05:42.817 [2024-11-28 07:30:53.557854] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1644451 ]
00:05:43.076 EAL: No free 2048 kB hugepages reported on node 1
00:05:43.076 [2024-11-28 07:30:53.625763] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:43.076 [2024-11-28 07:30:53.662629] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:43.076 [2024-11-28 07:30:53.662737] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:05:43.076 [2024-11-28 07:30:53.662748] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1644451' to capture a snapshot of events at runtime.
00:05:43.076 [2024-11-28 07:30:53.662759] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1644451 for offline analysis/debug.
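Here rpc.sh has started spdk_tgt in the background and waitforlisten blocks until the daemon's JSON-RPC socket answers; only then do the rpc_* subtests begin. A simplified sketch of that wait (the real helper is waitforlisten in test/common/autotest_common.sh; the rpc.py path is assumed from the same tree):

sock=/var/tmp/spdk.sock
# Poll the UNIX-domain RPC socket; spdk_tgt is ready once any RPC succeeds.
for _ in $(seq 1 100); do
    ./scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
    sleep 0.1
done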
00:05:43.076 [2024-11-28 07:30:53.662782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:43.645 07:30:54 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:43.645 07:30:54 -- common/autotest_common.sh@862 -- # return 0
00:05:43.645 07:30:54 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:05:43.645 07:30:54 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:05:43.645 07:30:54 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd
00:05:43.645 07:30:54 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity
00:05:43.645 07:30:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.645 07:30:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.645 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.645 ************************************
00:05:43.645 START TEST rpc_integrity
00:05:43.645 ************************************
00:05:43.645 07:30:54 -- common/autotest_common.sh@1114 -- # rpc_integrity
00:05:43.645 07:30:54 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:05:43.645 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.645 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.645 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.645 07:30:54 -- rpc/rpc.sh@12 -- # bdevs='[]'
00:05:43.645 07:30:54 -- rpc/rpc.sh@13 -- # jq length
00:05:43.645 07:30:54 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:05:43.645 07:30:54 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:05:43.645 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.645 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@15 -- # malloc=Malloc0
00:05:43.905 07:30:54 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@16 -- # bdevs='[
00:05:43.905 {
00:05:43.905 "name": "Malloc0",
00:05:43.905 "aliases": [
00:05:43.905 "ba46d3b0-e984-4e54-84cb-86f7ea00ca87"
00:05:43.905 ],
00:05:43.905 "product_name": "Malloc disk",
00:05:43.905 "block_size": 512,
00:05:43.905 "num_blocks": 16384,
00:05:43.905 "uuid": "ba46d3b0-e984-4e54-84cb-86f7ea00ca87",
00:05:43.905 "assigned_rate_limits": {
00:05:43.905 "rw_ios_per_sec": 0,
00:05:43.905 "rw_mbytes_per_sec": 0,
00:05:43.905 "r_mbytes_per_sec": 0,
00:05:43.905 "w_mbytes_per_sec": 0
00:05:43.905 },
00:05:43.905 "claimed": false,
00:05:43.905 "zoned": false,
00:05:43.905 "supported_io_types": {
00:05:43.905 "read": true,
00:05:43.905 "write": true,
00:05:43.905 "unmap": true,
00:05:43.905 "write_zeroes": true,
00:05:43.905 "flush": true,
00:05:43.905 "reset": true,
00:05:43.905 "compare": false,
00:05:43.905 "compare_and_write": false,
00:05:43.905 "abort": true,
00:05:43.905 "nvme_admin": false,
00:05:43.905 "nvme_io": false
00:05:43.905 },
00:05:43.905 "memory_domains": [
00:05:43.905 {
00:05:43.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:43.905 "dma_device_type": 2
00:05:43.905 }
00:05:43.905 ],
00:05:43.905 "driver_specific": {}
00:05:43.905 }
00:05:43.905 ]'
00:05:43.905 07:30:54 -- rpc/rpc.sh@17 -- # jq length
00:05:43.905 07:30:54 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:05:43.905 07:30:54 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 [2024-11-28 07:30:54.487835] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0
00:05:43.905 [2024-11-28 07:30:54.487869] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:05:43.905 [2024-11-28 07:30:54.487892] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x53db850
00:05:43.905 [2024-11-28 07:30:54.487902] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:05:43.905 [2024-11-28 07:30:54.488712] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:05:43.905 [2024-11-28 07:30:54.488735] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:05:43.905 Passthru0
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@20 -- # bdevs='[
00:05:43.905 {
00:05:43.905 "name": "Malloc0",
00:05:43.905 "aliases": [
00:05:43.905 "ba46d3b0-e984-4e54-84cb-86f7ea00ca87"
00:05:43.905 ],
00:05:43.905 "product_name": "Malloc disk",
00:05:43.905 "block_size": 512,
00:05:43.905 "num_blocks": 16384,
00:05:43.905 "uuid": "ba46d3b0-e984-4e54-84cb-86f7ea00ca87",
00:05:43.905 "assigned_rate_limits": {
00:05:43.905 "rw_ios_per_sec": 0,
00:05:43.905 "rw_mbytes_per_sec": 0,
00:05:43.905 "r_mbytes_per_sec": 0,
00:05:43.905 "w_mbytes_per_sec": 0
00:05:43.905 },
00:05:43.905 "claimed": true,
00:05:43.905 "claim_type": "exclusive_write",
00:05:43.905 "zoned": false,
00:05:43.905 "supported_io_types": {
00:05:43.905 "read": true,
00:05:43.905 "write": true,
00:05:43.905 "unmap": true,
00:05:43.905 "write_zeroes": true,
00:05:43.905 "flush": true,
00:05:43.905 "reset": true,
00:05:43.905 "compare": false,
00:05:43.905 "compare_and_write": false,
00:05:43.905 "abort": true,
00:05:43.905 "nvme_admin": false,
00:05:43.905 "nvme_io": false
00:05:43.905 },
00:05:43.905 "memory_domains": [
00:05:43.905 {
00:05:43.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:43.905 "dma_device_type": 2
00:05:43.905 }
00:05:43.905 ],
00:05:43.905 "driver_specific": {}
00:05:43.905 },
00:05:43.905 {
00:05:43.905 "name": "Passthru0",
00:05:43.905 "aliases": [
00:05:43.905 "f5e091ea-63e4-550e-b0a3-2a13261a848f"
00:05:43.905 ],
00:05:43.905 "product_name": "passthru",
00:05:43.905 "block_size": 512,
00:05:43.905 "num_blocks": 16384,
00:05:43.905 "uuid": "f5e091ea-63e4-550e-b0a3-2a13261a848f",
00:05:43.905 "assigned_rate_limits": {
00:05:43.905 "rw_ios_per_sec": 0,
00:05:43.905 "rw_mbytes_per_sec": 0,
00:05:43.905 "r_mbytes_per_sec": 0,
00:05:43.905 "w_mbytes_per_sec": 0
00:05:43.905 },
00:05:43.905 "claimed": false,
00:05:43.905 "zoned": false,
00:05:43.905 "supported_io_types": {
00:05:43.905 "read": true,
00:05:43.905 "write": true,
00:05:43.905 "unmap": true,
00:05:43.905 "write_zeroes": true,
00:05:43.905 "flush": true,
00:05:43.905 "reset": true,
00:05:43.905 "compare": false,
00:05:43.905 "compare_and_write": false,
00:05:43.905 "abort": true,
00:05:43.905 "nvme_admin": false,
00:05:43.905 "nvme_io": false
00:05:43.905 },
00:05:43.905 "memory_domains": [
00:05:43.905 {
00:05:43.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:43.905 "dma_device_type": 2
00:05:43.905 }
00:05:43.905 ],
00:05:43.905 "driver_specific": {
00:05:43.905 "passthru": {
00:05:43.905 "name": "Passthru0",
00:05:43.905 "base_bdev_name": "Malloc0"
00:05:43.905 }
00:05:43.905 }
00:05:43.905 }
00:05:43.905 ]'
00:05:43.905 07:30:54 -- rpc/rpc.sh@21 -- # jq length
00:05:43.905 07:30:54 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:05:43.905 07:30:54 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.905 07:30:54 -- rpc/rpc.sh@25 -- # bdevs='[]'
00:05:43.905 07:30:54 -- rpc/rpc.sh@26 -- # jq length
00:05:43.905 07:30:54 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:05:43.905
00:05:43.905 real 0m0.253s
00:05:43.905 user 0m0.155s
00:05:43.905 sys 0m0.031s
00:05:43.905 07:30:54 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 ************************************
00:05:43.905 END TEST rpc_integrity
00:05:43.905 ************************************
00:05:43.905 07:30:54 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins
00:05:43.905 07:30:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.905 07:30:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.905 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.905 ************************************
00:05:43.905 START TEST rpc_plugins
00:05:43.905 ************************************
00:05:43.905 07:30:54 -- common/autotest_common.sh@1114 -- # rpc_plugins
00:05:43.905 07:30:54 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc
00:05:43.905 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:43.906 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:43.906 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.906 07:30:54 -- rpc/rpc.sh@30 -- # malloc=Malloc1
00:05:44.165 07:30:54 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs
00:05:44.165 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.165 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.165 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.165 07:30:54 -- rpc/rpc.sh@31 -- # bdevs='[
00:05:44.165 {
00:05:44.165 "name": "Malloc1",
00:05:44.165 "aliases": [
00:05:44.165 "4db9a01d-faff-4b54-8784-55802db4d8d8"
00:05:44.165 ],
00:05:44.165 "product_name": "Malloc disk",
00:05:44.165 "block_size": 4096,
00:05:44.165 "num_blocks": 256,
00:05:44.165 "uuid": "4db9a01d-faff-4b54-8784-55802db4d8d8",
00:05:44.165 "assigned_rate_limits": {
00:05:44.165 "rw_ios_per_sec": 0,
00:05:44.165 "rw_mbytes_per_sec": 0,
00:05:44.165 "r_mbytes_per_sec": 0,
00:05:44.165 "w_mbytes_per_sec": 0
00:05:44.165 },
00:05:44.165 "claimed": false,
00:05:44.165 "zoned": false,
00:05:44.165 "supported_io_types": {
00:05:44.165 "read": true,
00:05:44.165 "write": true,
00:05:44.165 "unmap": true,
00:05:44.165 "write_zeroes": true,
00:05:44.165 "flush": true,
00:05:44.165 "reset": true,
00:05:44.165 "compare": false,
00:05:44.165 "compare_and_write": false,
00:05:44.165 "abort": true,
00:05:44.165 "nvme_admin": false,
00:05:44.165 "nvme_io": false
00:05:44.165 },
00:05:44.165 "memory_domains": [
00:05:44.165 {
00:05:44.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:44.165 "dma_device_type": 2
00:05:44.165 }
00:05:44.165 ],
00:05:44.165 "driver_specific": {}
00:05:44.165 }
00:05:44.165 ]'
00:05:44.165 07:30:54 -- rpc/rpc.sh@32 -- # jq length
00:05:44.166 07:30:54 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']'
00:05:44.166 07:30:54 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1
00:05:44.166 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.166 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.166 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.166 07:30:54 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs
00:05:44.166 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.166 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.166 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.166 07:30:54 -- rpc/rpc.sh@35 -- # bdevs='[]'
00:05:44.166 07:30:54 -- rpc/rpc.sh@36 -- # jq length
00:05:44.166 07:30:54 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']'
00:05:44.166
00:05:44.166 real 0m0.135s
00:05:44.166 user 0m0.079s
00:05:44.166 sys 0m0.021s
00:05:44.166 07:30:54 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:44.166 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.166 ************************************
00:05:44.166 END TEST rpc_plugins
00:05:44.166 ************************************
00:05:44.166 07:30:54 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test
00:05:44.166 07:30:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:44.166 07:30:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:44.166 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.166 ************************************
00:05:44.166 START TEST rpc_trace_cmd_test
00:05:44.166 ************************************
00:05:44.166 07:30:54 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test
00:05:44.166 07:30:54 -- rpc/rpc.sh@40 -- # local info
00:05:44.166 07:30:54 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info
00:05:44.166 07:30:54 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.166 07:30:54 -- common/autotest_common.sh@10 -- # set +x
00:05:44.166 07:30:54 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.166 07:30:54 -- rpc/rpc.sh@42 -- # info='{
00:05:44.166 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1644451",
00:05:44.166 "tpoint_group_mask": "0x8",
00:05:44.166 "iscsi_conn": {
00:05:44.166 "mask": "0x2",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "scsi": {
00:05:44.166 "mask": "0x4",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "bdev": {
00:05:44.166 "mask": "0x8",
00:05:44.166 "tpoint_mask": "0xffffffffffffffff"
00:05:44.166 },
00:05:44.166 "nvmf_rdma": {
00:05:44.166 "mask": "0x10",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "nvmf_tcp": {
00:05:44.166 "mask": "0x20",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "ftl": {
00:05:44.166 "mask": "0x40",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "blobfs": {
00:05:44.166 "mask": "0x80",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "dsa": {
00:05:44.166 "mask": "0x200",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "thread": {
00:05:44.166 "mask": "0x400",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "nvme_pcie": {
00:05:44.166 "mask": "0x800",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "iaa": {
00:05:44.166 "mask": "0x1000",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "nvme_tcp": {
00:05:44.166 "mask": "0x2000",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 },
00:05:44.166 "bdev_nvme": {
00:05:44.166 "mask": "0x4000",
00:05:44.166 "tpoint_mask": "0x0"
00:05:44.166 }
00:05:44.166 }'
00:05:44.166 07:30:54 -- rpc/rpc.sh@43 -- # jq length
00:05:44.166 07:30:54 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']'
00:05:44.166 07:30:54 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")'
00:05:44.166 07:30:54 -- rpc/rpc.sh@44 -- # '[' true = true ']'
00:05:44.166 07:30:54 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")'
00:05:44.425 07:30:54 -- rpc/rpc.sh@45 -- # '[' true = true ']'
00:05:44.425 07:30:54 -- rpc/rpc.sh@46 -- # jq 'has("bdev")'
00:05:44.425 07:30:54 -- rpc/rpc.sh@46 -- # '[' true = true ']'
00:05:44.425 07:30:54 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask
00:05:44.426 07:30:55 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']'
00:05:44.426
00:05:44.426 real 0m0.197s
00:05:44.426 user 0m0.161s
00:05:44.426 sys 0m0.026s
00:05:44.426 07:30:55 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:44.426 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.426 ************************************
00:05:44.426 END TEST rpc_trace_cmd_test
00:05:44.426 ************************************
00:05:44.426 07:30:55 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]]
00:05:44.426 07:30:55 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd
00:05:44.426 07:30:55 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity
00:05:44.426 07:30:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:44.426 07:30:55 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:44.426 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.426 ************************************
00:05:44.426 START TEST rpc_daemon_integrity
00:05:44.426 ************************************
00:05:44.426 07:30:55 -- common/autotest_common.sh@1114 -- # rpc_integrity
00:05:44.426 07:30:55 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:05:44.426 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.426 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.426 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.426 07:30:55 -- rpc/rpc.sh@12 -- # bdevs='[]'
00:05:44.426 07:30:55 -- rpc/rpc.sh@13 -- # jq length
00:05:44.426 07:30:55 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:05:44.426 07:30:55 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:05:44.426 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.426 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.426 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.426 07:30:55 -- rpc/rpc.sh@15 -- # malloc=Malloc2
00:05:44.426 07:30:55 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:05:44.426 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.426 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.426 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.426 07:30:55 -- rpc/rpc.sh@16 -- # bdevs='[
00:05:44.426 {
00:05:44.426 "name": "Malloc2",
00:05:44.426 "aliases": [
00:05:44.426 "5f87c612-cd58-422c-99ab-e939f0616804"
00:05:44.426 ],
00:05:44.426 "product_name": "Malloc disk",
00:05:44.426 "block_size": 512,
00:05:44.426 "num_blocks": 16384,
00:05:44.426 "uuid": "5f87c612-cd58-422c-99ab-e939f0616804",
00:05:44.426 "assigned_rate_limits": {
00:05:44.426 "rw_ios_per_sec": 0,
00:05:44.426 "rw_mbytes_per_sec": 0,
00:05:44.426 "r_mbytes_per_sec": 0,
00:05:44.426 "w_mbytes_per_sec": 0
00:05:44.426 },
00:05:44.426 "claimed": false,
00:05:44.426 "zoned": false,
00:05:44.426 "supported_io_types": {
00:05:44.426 "read": true,
00:05:44.426 "write": true,
00:05:44.426 "unmap": true,
00:05:44.426 "write_zeroes": true,
00:05:44.426 "flush": true,
00:05:44.426 "reset": true,
00:05:44.426 "compare": false,
00:05:44.426 "compare_and_write": false,
00:05:44.426 "abort": true,
00:05:44.426 "nvme_admin": false,
00:05:44.426 "nvme_io": false
00:05:44.426 },
00:05:44.426 "memory_domains": [
00:05:44.426 {
00:05:44.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:44.426 "dma_device_type": 2
00:05:44.426 }
00:05:44.426 ],
00:05:44.426 "driver_specific": {}
00:05:44.426 }
00:05:44.426 ]'
00:05:44.426 07:30:55 -- rpc/rpc.sh@17 -- # jq length
00:05:44.685 07:30:55 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:05:44.685 07:30:55 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0
00:05:44.685 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 [2024-11-28 07:30:55.209734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2
00:05:44.685 [2024-11-28 07:30:55.209764] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:05:44.685 [2024-11-28 07:30:55.209781] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x53dd4c0
00:05:44.685 [2024-11-28 07:30:55.209791] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:05:44.685 [2024-11-28 07:30:55.210470] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:05:44.685 [2024-11-28 07:30:55.210490] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:05:44.685 Passthru0
00:05:44.685 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.685 07:30:55 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:05:44.685 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.685 07:30:55 -- rpc/rpc.sh@20 -- # bdevs='[
00:05:44.685 {
00:05:44.685 "name": "Malloc2",
00:05:44.685 "aliases": [
00:05:44.685 "5f87c612-cd58-422c-99ab-e939f0616804"
00:05:44.685 ],
00:05:44.685 "product_name": "Malloc disk",
00:05:44.685 "block_size": 512,
00:05:44.685 "num_blocks": 16384,
00:05:44.685 "uuid": "5f87c612-cd58-422c-99ab-e939f0616804",
00:05:44.685 "assigned_rate_limits": {
00:05:44.685 "rw_ios_per_sec": 0,
00:05:44.685 "rw_mbytes_per_sec": 0,
00:05:44.685 "r_mbytes_per_sec": 0,
00:05:44.685 "w_mbytes_per_sec": 0
00:05:44.685 },
00:05:44.685 "claimed": true,
00:05:44.685 "claim_type": "exclusive_write",
00:05:44.685 "zoned": false,
00:05:44.685 "supported_io_types": {
00:05:44.685 "read": true,
00:05:44.685 "write": true,
00:05:44.685 "unmap": true,
00:05:44.685 "write_zeroes": true,
00:05:44.685 "flush": true,
00:05:44.685 "reset": true,
00:05:44.685 "compare": false,
00:05:44.685 "compare_and_write": false,
00:05:44.685 "abort": true,
00:05:44.685 "nvme_admin": false,
00:05:44.685 "nvme_io": false
00:05:44.685 },
00:05:44.685 "memory_domains": [
00:05:44.685 {
00:05:44.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:44.685 "dma_device_type": 2
00:05:44.685 }
00:05:44.685 ],
00:05:44.685 "driver_specific": {}
00:05:44.685 },
00:05:44.685 {
00:05:44.685 "name": "Passthru0",
00:05:44.685 "aliases": [
00:05:44.685 "f6cd6a16-b449-5e0c-9e03-8150bc961930"
00:05:44.685 ],
00:05:44.685 "product_name": "passthru",
00:05:44.685 "block_size": 512,
00:05:44.685 "num_blocks": 16384,
00:05:44.685 "uuid": "f6cd6a16-b449-5e0c-9e03-8150bc961930",
00:05:44.685 "assigned_rate_limits": {
00:05:44.685 "rw_ios_per_sec": 0,
00:05:44.685 "rw_mbytes_per_sec": 0,
00:05:44.685 "r_mbytes_per_sec": 0,
00:05:44.685 "w_mbytes_per_sec": 0
00:05:44.685 },
00:05:44.685 "claimed": false,
00:05:44.685 "zoned": false,
00:05:44.685 "supported_io_types": {
00:05:44.685 "read": true,
00:05:44.685 "write": true,
00:05:44.685 "unmap": true,
00:05:44.685 "write_zeroes": true,
00:05:44.685 "flush": true,
00:05:44.685 "reset": true,
00:05:44.685 "compare": false,
00:05:44.685 "compare_and_write": false,
00:05:44.685 "abort": true,
00:05:44.685 "nvme_admin": false,
00:05:44.685 "nvme_io": false
00:05:44.685 },
00:05:44.685 "memory_domains": [
00:05:44.685 {
00:05:44.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:05:44.685 "dma_device_type": 2
00:05:44.685 }
00:05:44.685 ],
00:05:44.685 "driver_specific": {
00:05:44.685 "passthru": {
00:05:44.685 "name": "Passthru0",
00:05:44.685 "base_bdev_name": "Malloc2"
00:05:44.685 }
00:05:44.685 }
00:05:44.685 }
00:05:44.685 ]'
00:05:44.685 07:30:55 -- rpc/rpc.sh@21 -- # jq length
00:05:44.685 07:30:55 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:05:44.685 07:30:55 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:05:44.685 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.685 07:30:55 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2
00:05:44.685 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.685 07:30:55 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:05:44.685 07:30:55 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 07:30:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:44.685 07:30:55 -- rpc/rpc.sh@25 -- # bdevs='[]'
00:05:44.685 07:30:55 -- rpc/rpc.sh@26 -- # jq length
00:05:44.685 07:30:55 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:05:44.685
00:05:44.685 real 0m0.272s
00:05:44.685 user 0m0.171s
00:05:44.685 sys 0m0.034s
00:05:44.685 07:30:55 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:44.685 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:44.685 ************************************
00:05:44.685 END TEST rpc_daemon_integrity
00:05:44.685 ************************************
00:05:44.685 07:30:55 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT
00:05:44.685 07:30:55 -- rpc/rpc.sh@84 -- # killprocess 1644451
00:05:44.685 07:30:55 -- common/autotest_common.sh@936 -- # '[' -z 1644451 ']'
00:05:44.685 07:30:55 -- common/autotest_common.sh@940 -- # kill -0 1644451
00:05:44.685 07:30:55 -- common/autotest_common.sh@941 -- # uname
00:05:44.685 07:30:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:44.685 07:30:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1644451
00:05:44.944 07:30:55 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:44.944 07:30:55 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:44.944 07:30:55 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1644451'
00:05:44.944 killing process with pid 1644451
00:05:44.944 07:30:55 -- common/autotest_common.sh@955 -- # kill 1644451
00:05:44.944 07:30:55 -- common/autotest_common.sh@960 -- # wait 1644451
00:05:45.204
00:05:45.204 real 0m2.388s
00:05:45.204 user 0m2.960s
00:05:45.204 sys 0m0.690s
00:05:45.204 07:30:55 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:45.204 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:45.204 ************************************
00:05:45.204 END TEST rpc
00:05:45.204 ************************************
00:05:45.204 07:30:55 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:05:45.204 07:30:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:45.204 07:30:55 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:45.204 07:30:55 -- common/autotest_common.sh@10 -- # set +x
00:05:45.204 ************************************
00:05:45.204 START TEST rpc_client
00:05:45.204 ************************************
00:05:45.204 07:30:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:05:45.204 * Looking for test storage...
00:05:45.204 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client
00:05:45.204 07:30:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:45.204 07:30:55 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:45.204 07:30:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:45.204 07:30:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:45.204 07:30:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:45.204 07:30:55 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:45.204 07:30:55 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:45.204 07:30:55 -- scripts/common.sh@335 -- # IFS=.-:
00:05:45.204 07:30:55 -- scripts/common.sh@335 -- # read -ra ver1
00:05:45.204 07:30:55 -- scripts/common.sh@336 -- # IFS=.-:
00:05:45.204 07:30:55 -- scripts/common.sh@336 -- # read -ra ver2
00:05:45.204 07:30:55 -- scripts/common.sh@337 -- # local 'op=<'
00:05:45.204 07:30:55 -- scripts/common.sh@339 -- # ver1_l=2
00:05:45.204 07:30:55 -- scripts/common.sh@340 -- # ver2_l=1
00:05:45.204 07:30:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:45.204 07:30:55 -- scripts/common.sh@343 -- # case "$op" in
00:05:45.204 07:30:55 -- scripts/common.sh@344 -- # : 1
00:05:45.204 07:30:55 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:45.204 07:30:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:45.204 07:30:55 -- scripts/common.sh@364 -- # decimal 1
00:05:45.464 07:30:55 -- scripts/common.sh@352 -- # local d=1
00:05:45.464 07:30:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:45.464 07:30:55 -- scripts/common.sh@354 -- # echo 1
00:05:45.464 07:30:55 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:45.464 07:30:55 -- scripts/common.sh@365 -- # decimal 2
00:05:45.464 07:30:55 -- scripts/common.sh@352 -- # local d=2
00:05:45.464 07:30:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:45.464 07:30:55 -- scripts/common.sh@354 -- # echo 2
00:05:45.464 07:30:55 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:45.464 07:30:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:45.464 07:30:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:45.464 07:30:55 -- scripts/common.sh@367 -- # return 0
00:05:45.464 07:30:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:45.464 07:30:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.464 --rc genhtml_branch_coverage=1
00:05:45.464 --rc genhtml_function_coverage=1
00:05:45.464 --rc genhtml_legend=1
00:05:45.464 --rc geninfo_all_blocks=1
00:05:45.464 --rc geninfo_unexecuted_blocks=1
00:05:45.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.464 '
00:05:45.464 07:30:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.464 --rc genhtml_branch_coverage=1
00:05:45.464 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:55 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test
00:05:45.465 OK
00:05:45.465 07:30:56 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT
00:05:45.465
00:05:45.465 real 0m0.212s
00:05:45.465 user 0m0.120s
00:05:45.465 sys 0m0.110s
00:05:45.465 07:30:56 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:45.465 07:30:56 -- common/autotest_common.sh@10 -- # set +x
00:05:45.465 ************************************
00:05:45.465 END TEST rpc_client
00:05:45.465 ************************************
00:05:45.465 07:30:56 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:05:45.465 07:30:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:45.465 07:30:56 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:45.465 07:30:56 -- common/autotest_common.sh@10 -- # set +x
00:05:45.465 ************************************
00:05:45.465 START TEST json_config
00:05:45.465 ************************************
00:05:45.465 07:30:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:05:45.465 07:30:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:45.465 07:30:56 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:45.465 07:30:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:45.465 07:30:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:45.465 07:30:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:45.465 07:30:56 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:45.465 07:30:56 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:45.465 07:30:56 -- scripts/common.sh@335 -- # IFS=.-:
00:05:45.465 07:30:56 -- scripts/common.sh@335 -- # read -ra ver1
00:05:45.465 07:30:56 -- scripts/common.sh@336 -- # IFS=.-:
00:05:45.465 07:30:56 -- scripts/common.sh@336 -- # read -ra ver2
00:05:45.465 07:30:56 -- scripts/common.sh@337 -- # local 'op=<'
00:05:45.465 07:30:56 -- scripts/common.sh@339 -- # ver1_l=2
00:05:45.465 07:30:56 -- scripts/common.sh@340 -- # ver2_l=1
00:05:45.465 07:30:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:45.465 07:30:56 -- scripts/common.sh@343 -- # case "$op" in
00:05:45.465 07:30:56 -- scripts/common.sh@344 -- # : 1
00:05:45.465 07:30:56 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:45.465 07:30:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:45.465 07:30:56 -- scripts/common.sh@364 -- # decimal 1
00:05:45.465 07:30:56 -- scripts/common.sh@352 -- # local d=1
00:05:45.465 07:30:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:45.465 07:30:56 -- scripts/common.sh@354 -- # echo 1
00:05:45.465 07:30:56 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:45.465 07:30:56 -- scripts/common.sh@365 -- # decimal 2
00:05:45.465 07:30:56 -- scripts/common.sh@352 -- # local d=2
00:05:45.465 07:30:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:45.465 07:30:56 -- scripts/common.sh@354 -- # echo 2
00:05:45.465 07:30:56 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:45.465 07:30:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:45.465 07:30:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:45.465 07:30:56 -- scripts/common.sh@367 -- # return 0
00:05:45.465 07:30:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:45.465 07:30:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:45.465 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:45.465 --rc genhtml_branch_coverage=1
00:05:45.465 --rc genhtml_function_coverage=1
00:05:45.465 --rc genhtml_legend=1
00:05:45.465 --rc geninfo_all_blocks=1
00:05:45.465 --rc geninfo_unexecuted_blocks=1
00:05:45.465 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:45.465 '
00:05:45.465 07:30:56 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh
00:05:45.465 07:30:56 -- nvmf/common.sh@7 -- # uname -s
00:05:45.465 07:30:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:05:45.465 07:30:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:05:45.465 07:30:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:05:45.465 07:30:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:05:45.465 07:30:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:05:45.465 07:30:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:05:45.465 07:30:56 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:05:45.465 07:30:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:05:45.465 07:30:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:05:45.465 07:30:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:05:45.465 07:30:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562
00:05:45.465 07:30:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562
00:05:45.465 07:30:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:05:45.465 07:30:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:05:45.465 07:30:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:05:45.725 07:30:56 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:05:45.725 07:30:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]]
00:05:45.725 07:30:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:05:45.725 07:30:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:05:45.726 07:30:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:45.726 07:30:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:45.726 07:30:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:45.726 07:30:56 -- paths/export.sh@5 -- # export PATH
00:05:45.726 07:30:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:05:45.726 07:30:56 -- nvmf/common.sh@46 -- # : 0
00:05:45.726 07:30:56 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID
00:05:45.726 07:30:56 -- nvmf/common.sh@48 -- # build_nvmf_app_args
00:05:45.726 07:30:56 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']'
00:05:45.726 07:30:56 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:05:45.726 07:30:56 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:05:45.726 07:30:56 -- nvmf/common.sh@32 -- # '[' -n '' ']'
00:05:45.726 07:30:56 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']'
07:30:56 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:45.726 07:30:56 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:45.726 07:30:56 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:45.726 07:30:56 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:45.726 07:30:56 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:45.726 07:30:56 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:45.726 WARNING: No tests are enabled so not running JSON configuration tests 00:05:45.726 07:30:56 -- json_config/json_config.sh@27 -- # exit 0 00:05:45.726 00:05:45.726 real 0m0.188s 00:05:45.726 user 0m0.107s 00:05:45.726 sys 0m0.090s 00:05:45.726 07:30:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:45.726 07:30:56 -- common/autotest_common.sh@10 -- # set +x 00:05:45.726 ************************************ 00:05:45.726 END TEST json_config 00:05:45.726 ************************************ 00:05:45.726 07:30:56 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:45.726 07:30:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.726 07:30:56 -- common/autotest_common.sh@10 -- # set +x 00:05:45.726 ************************************ 00:05:45.726 START TEST json_config_extra_key 00:05:45.726 ************************************ 00:05:45.726 07:30:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:45.726 07:30:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:45.726 07:30:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:45.726 07:30:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:45.726 07:30:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:45.726 07:30:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:45.726 07:30:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:45.726 07:30:56 -- scripts/common.sh@335 -- # IFS=.-: 00:05:45.726 07:30:56 -- scripts/common.sh@335 -- # read -ra ver1 00:05:45.726 07:30:56 -- scripts/common.sh@336 -- # IFS=.-: 00:05:45.726 07:30:56 -- scripts/common.sh@336 -- # read -ra ver2 00:05:45.726 07:30:56 -- scripts/common.sh@337 -- # local 'op=<' 00:05:45.726 07:30:56 -- scripts/common.sh@339 -- # ver1_l=2 00:05:45.726 07:30:56 -- scripts/common.sh@340 -- # ver2_l=1 00:05:45.726 07:30:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:45.726 07:30:56 -- scripts/common.sh@343 -- # case "$op" in 00:05:45.726 07:30:56 -- scripts/common.sh@344 -- # : 1 00:05:45.726 07:30:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:45.726 07:30:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:45.726 07:30:56 -- scripts/common.sh@364 -- # decimal 1 00:05:45.726 07:30:56 -- scripts/common.sh@352 -- # local d=1 00:05:45.726 07:30:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:45.726 07:30:56 -- scripts/common.sh@354 -- # echo 1 00:05:45.726 07:30:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:45.726 07:30:56 -- scripts/common.sh@365 -- # decimal 2 00:05:45.726 07:30:56 -- scripts/common.sh@352 -- # local d=2 00:05:45.726 07:30:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:45.726 07:30:56 -- scripts/common.sh@354 -- # echo 2 00:05:45.726 07:30:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:45.726 07:30:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:45.726 07:30:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:45.726 07:30:56 -- scripts/common.sh@367 -- # return 0 00:05:45.726 07:30:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:45.726 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.726 --rc genhtml_branch_coverage=1 00:05:45.726 --rc genhtml_function_coverage=1 00:05:45.726 --rc genhtml_legend=1 00:05:45.726 --rc geninfo_all_blocks=1 00:05:45.726 --rc geninfo_unexecuted_blocks=1 00:05:45.726 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.726 ' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:45.726 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.726 --rc genhtml_branch_coverage=1 00:05:45.726 --rc genhtml_function_coverage=1 00:05:45.726 --rc genhtml_legend=1 00:05:45.726 --rc geninfo_all_blocks=1 00:05:45.726 --rc geninfo_unexecuted_blocks=1 00:05:45.726 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.726 ' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:45.726 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.726 --rc genhtml_branch_coverage=1 00:05:45.726 --rc genhtml_function_coverage=1 00:05:45.726 --rc genhtml_legend=1 00:05:45.726 --rc geninfo_all_blocks=1 00:05:45.726 --rc geninfo_unexecuted_blocks=1 00:05:45.726 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.726 ' 00:05:45.726 07:30:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:45.726 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.726 --rc genhtml_branch_coverage=1 00:05:45.726 --rc genhtml_function_coverage=1 00:05:45.726 --rc genhtml_legend=1 00:05:45.726 --rc geninfo_all_blocks=1 00:05:45.726 --rc geninfo_unexecuted_blocks=1 00:05:45.726 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.726 ' 00:05:45.726 07:30:56 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:45.726 07:30:56 -- nvmf/common.sh@7 -- # uname -s 00:05:45.726 07:30:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:45.726 07:30:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:45.726 07:30:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:45.726 07:30:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:45.726 07:30:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:45.726 07:30:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:45.726 07:30:56 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:45.726 07:30:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:45.726 07:30:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:45.726 07:30:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:45.726 07:30:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:45.726 07:30:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:45.726 07:30:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:45.726 07:30:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:45.726 07:30:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:45.726 07:30:56 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:45.726 07:30:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:45.726 07:30:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:45.726 07:30:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:45.726 07:30:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.726 07:30:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.726 07:30:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.726 07:30:56 -- paths/export.sh@5 -- # export PATH 00:05:45.727 07:30:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.727 07:30:56 -- nvmf/common.sh@46 -- # : 0 00:05:45.727 07:30:56 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:45.727 07:30:56 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:45.727 07:30:56 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:45.727 07:30:56 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:45.727 07:30:56 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:45.727 07:30:56 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:45.727 07:30:56 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:45.727 
07:30:56 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:45.727 INFO: launching applications... 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=1645248 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:45.727 Waiting for target to run... 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 1645248 /var/tmp/spdk_tgt.sock 00:05:45.727 07:30:56 -- common/autotest_common.sh@829 -- # '[' -z 1645248 ']' 00:05:45.727 07:30:56 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:45.727 07:30:56 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:45.727 07:30:56 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:45.727 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:45.727 07:30:56 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:45.727 07:30:56 -- common/autotest_common.sh@10 -- # set +x 00:05:45.727 07:30:56 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:45.727 [2024-11-28 07:30:56.466790] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
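The trace above launches spdk_tgt with the extra_key.json config and then parks in waitforlisten until the RPC socket answers. A minimal sketch of that wait pattern, assuming rpc.py and spdk_tgt are invoked from the SPDK tree as in the trace (the helper name wait_for_rpc is an assumption, not the exact autotest_common.sh body):

# Sketch: poll a UNIX-domain RPC socket until the target answers.
wait_for_rpc() {
  local sock=$1 retries=${2:-100} i=0
  while (( i < retries )); do
    # spdk_get_version is a cheap RPC; success means the app is listening
    if ./scripts/rpc.py -s "$sock" spdk_get_version >/dev/null 2>&1; then
      return 0
    fi
    sleep 0.1
    i=$(( i + 1 ))
  done
  return 1
}

./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json test/json_config/extra_key.json &
wait_for_rpc /var/tmp/spdk_tgt.sock || exit 1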
00:05:45.727 [2024-11-28 07:30:56.466882] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645248 ] 00:05:45.986 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.986 [2024-11-28 07:30:56.748623] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.246 [2024-11-28 07:30:56.768137] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.246 [2024-11-28 07:30:56.768240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.814 07:30:57 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:46.814 07:30:57 -- common/autotest_common.sh@862 -- # return 0 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:46.814 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:46.814 INFO: shutting down applications... 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 1645248 ]] 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 1645248 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1645248 00:05:46.814 07:30:57 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1645248 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:47.074 SPDK target shutdown done 00:05:47.074 07:30:57 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:47.074 Success 00:05:47.074 00:05:47.074 real 0m1.507s 00:05:47.074 user 0m1.224s 00:05:47.074 sys 0m0.421s 00:05:47.074 07:30:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:47.074 07:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:47.074 ************************************ 00:05:47.074 END TEST json_config_extra_key 00:05:47.074 ************************************ 00:05:47.074 07:30:57 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:47.074 07:30:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.074 07:30:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.074 07:30:57 -- common/autotest_common.sh@10 -- # set +x 00:05:47.333 ************************************ 00:05:47.333 START TEST alias_rpc 00:05:47.333 ************************************ 00:05:47.333 07:30:57 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:47.333 * Looking for test storage... 00:05:47.333 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:47.333 07:30:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:47.333 07:30:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:47.333 07:30:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:47.333 07:30:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:47.333 07:30:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:47.333 07:30:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:47.333 07:30:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:47.333 07:30:58 -- scripts/common.sh@335 -- # IFS=.-: 00:05:47.333 07:30:58 -- scripts/common.sh@335 -- # read -ra ver1 00:05:47.333 07:30:58 -- scripts/common.sh@336 -- # IFS=.-: 00:05:47.333 07:30:58 -- scripts/common.sh@336 -- # read -ra ver2 00:05:47.333 07:30:58 -- scripts/common.sh@337 -- # local 'op=<' 00:05:47.333 07:30:58 -- scripts/common.sh@339 -- # ver1_l=2 00:05:47.333 07:30:58 -- scripts/common.sh@340 -- # ver2_l=1 00:05:47.333 07:30:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:47.333 07:30:58 -- scripts/common.sh@343 -- # case "$op" in 00:05:47.333 07:30:58 -- scripts/common.sh@344 -- # : 1 00:05:47.333 07:30:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:47.333 07:30:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:47.333 07:30:58 -- scripts/common.sh@364 -- # decimal 1 00:05:47.333 07:30:58 -- scripts/common.sh@352 -- # local d=1 00:05:47.334 07:30:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:47.334 07:30:58 -- scripts/common.sh@354 -- # echo 1 00:05:47.334 07:30:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:47.334 07:30:58 -- scripts/common.sh@365 -- # decimal 2 00:05:47.334 07:30:58 -- scripts/common.sh@352 -- # local d=2 00:05:47.334 07:30:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:47.334 07:30:58 -- scripts/common.sh@354 -- # echo 2 00:05:47.334 07:30:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:47.334 07:30:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:47.334 07:30:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:47.334 07:30:58 -- scripts/common.sh@367 -- # return 0 00:05:47.334 07:30:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:47.334 07:30:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:47.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.334 --rc genhtml_branch_coverage=1 00:05:47.334 --rc genhtml_function_coverage=1 00:05:47.334 --rc genhtml_legend=1 00:05:47.334 --rc geninfo_all_blocks=1 00:05:47.334 --rc geninfo_unexecuted_blocks=1 00:05:47.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.334 ' 00:05:47.334 07:30:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:47.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.334 --rc genhtml_branch_coverage=1 00:05:47.334 --rc genhtml_function_coverage=1 00:05:47.334 --rc genhtml_legend=1 00:05:47.334 --rc geninfo_all_blocks=1 00:05:47.334 --rc geninfo_unexecuted_blocks=1 00:05:47.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.334 ' 00:05:47.334 
07:30:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:47.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.334 --rc genhtml_branch_coverage=1 00:05:47.334 --rc genhtml_function_coverage=1 00:05:47.334 --rc genhtml_legend=1 00:05:47.334 --rc geninfo_all_blocks=1 00:05:47.334 --rc geninfo_unexecuted_blocks=1 00:05:47.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.334 ' 00:05:47.334 07:30:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:47.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.334 --rc genhtml_branch_coverage=1 00:05:47.334 --rc genhtml_function_coverage=1 00:05:47.334 --rc genhtml_legend=1 00:05:47.334 --rc geninfo_all_blocks=1 00:05:47.334 --rc geninfo_unexecuted_blocks=1 00:05:47.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.334 ' 00:05:47.334 07:30:58 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:47.334 07:30:58 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.334 07:30:58 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1645573 00:05:47.334 07:30:58 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1645573 00:05:47.334 07:30:58 -- common/autotest_common.sh@829 -- # '[' -z 1645573 ']' 00:05:47.334 07:30:58 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.334 07:30:58 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:47.334 07:30:58 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.334 07:30:58 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:47.334 07:30:58 -- common/autotest_common.sh@10 -- # set +x 00:05:47.334 [2024-11-28 07:30:58.047124] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
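The lt 1.15 2 walk traced above is scripts/common.sh comparing dotted version strings field by field, deciding whether the installed lcov understands the newer option spelling. A condensed sketch of the same comparison (simplified: the real cmp_versions also handles '>', '>=', and '<='):

# Returns success if $1 sorts strictly before $2 as dotted numeric versions.
version_lt() {
  local IFS=.
  local -a a=($1) b=($2)
  local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
  for (( i = 0; i < n; i++ )); do
    (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # missing fields count as 0
    (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
  done
  return 1   # equal versions are not less-than
}

version_lt 1.15 2 && echo 'lcov predates 2.x, use the legacy --rc spelling'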
00:05:47.334 [2024-11-28 07:30:58.047192] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645573 ] 00:05:47.334 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.593 [2024-11-28 07:30:58.111648] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.593 [2024-11-28 07:30:58.147537] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:47.593 [2024-11-28 07:30:58.147665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.162 07:30:58 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:48.162 07:30:58 -- common/autotest_common.sh@862 -- # return 0 00:05:48.162 07:30:58 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:48.422 07:30:59 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1645573 00:05:48.422 07:30:59 -- common/autotest_common.sh@936 -- # '[' -z 1645573 ']' 00:05:48.422 07:30:59 -- common/autotest_common.sh@940 -- # kill -0 1645573 00:05:48.422 07:30:59 -- common/autotest_common.sh@941 -- # uname 00:05:48.422 07:30:59 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:48.422 07:30:59 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1645573 00:05:48.422 07:30:59 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:48.422 07:30:59 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:48.422 07:30:59 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1645573' 00:05:48.422 killing process with pid 1645573 00:05:48.422 07:30:59 -- common/autotest_common.sh@955 -- # kill 1645573 00:05:48.422 07:30:59 -- common/autotest_common.sh@960 -- # wait 1645573 00:05:48.682 00:05:48.682 real 0m1.603s 00:05:48.682 user 0m1.705s 00:05:48.682 sys 0m0.485s 00:05:48.682 07:30:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:48.682 07:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:48.682 ************************************ 00:05:48.682 END TEST alias_rpc 00:05:48.682 ************************************ 00:05:48.943 07:30:59 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:48.943 07:30:59 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:48.943 07:30:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:48.943 07:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:48.943 ************************************ 00:05:48.943 START TEST spdkcli_tcp 00:05:48.943 ************************************ 00:05:48.943 07:30:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:48.943 * Looking for test storage... 
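The killprocess trace above first probes the pid with kill -0, reads the process name with ps to make sure it is not about to signal a sudo wrapper, then signals and reaps it. A reduced sketch of that teardown, under the assumption that the target was started by the same shell (wait can only reap children):

killprocess() {
  local pid=$1
  kill -0 "$pid" 2>/dev/null || return 1        # is it still alive?
  local name
  name=$(ps --no-headers -o comm= "$pid")       # e.g. reactor_0
  [ "$name" = sudo ] && return 1                # never signal a sudo wrapper
  echo "killing process with pid $pid"
  kill "$pid"                                   # SIGTERM by default
  wait "$pid" 2>/dev/null                       # reap our own child
}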
00:05:48.943 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:48.943 07:30:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:48.943 07:30:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:48.943 07:30:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:48.943 07:30:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:48.943 07:30:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:48.943 07:30:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:48.943 07:30:59 -- scripts/common.sh@335 -- # IFS=.-: 00:05:48.943 07:30:59 -- scripts/common.sh@335 -- # read -ra ver1 00:05:48.943 07:30:59 -- scripts/common.sh@336 -- # IFS=.-: 00:05:48.943 07:30:59 -- scripts/common.sh@336 -- # read -ra ver2 00:05:48.943 07:30:59 -- scripts/common.sh@337 -- # local 'op=<' 00:05:48.943 07:30:59 -- scripts/common.sh@339 -- # ver1_l=2 00:05:48.943 07:30:59 -- scripts/common.sh@340 -- # ver2_l=1 00:05:48.943 07:30:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:48.943 07:30:59 -- scripts/common.sh@343 -- # case "$op" in 00:05:48.943 07:30:59 -- scripts/common.sh@344 -- # : 1 00:05:48.943 07:30:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:48.943 07:30:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:48.943 07:30:59 -- scripts/common.sh@364 -- # decimal 1 00:05:48.943 07:30:59 -- scripts/common.sh@352 -- # local d=1 00:05:48.943 07:30:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:48.943 07:30:59 -- scripts/common.sh@354 -- # echo 1 00:05:48.943 07:30:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:48.943 07:30:59 -- scripts/common.sh@365 -- # decimal 2 00:05:48.943 07:30:59 -- scripts/common.sh@352 -- # local d=2 00:05:48.943 07:30:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:48.943 07:30:59 -- scripts/common.sh@354 -- # echo 2 00:05:48.943 07:30:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:48.943 07:30:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:48.943 07:30:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:48.943 07:30:59 -- scripts/common.sh@367 -- # return 0 00:05:48.943 07:30:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:48.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.943 --rc genhtml_branch_coverage=1 00:05:48.943 --rc genhtml_function_coverage=1 00:05:48.943 --rc genhtml_legend=1 00:05:48.943 --rc geninfo_all_blocks=1 00:05:48.943 --rc geninfo_unexecuted_blocks=1 00:05:48.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.943 ' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:48.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.943 --rc genhtml_branch_coverage=1 00:05:48.943 --rc genhtml_function_coverage=1 00:05:48.943 --rc genhtml_legend=1 00:05:48.943 --rc geninfo_all_blocks=1 00:05:48.943 --rc geninfo_unexecuted_blocks=1 00:05:48.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.943 ' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:48.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.943 --rc genhtml_branch_coverage=1 
00:05:48.943 --rc genhtml_function_coverage=1 00:05:48.943 --rc genhtml_legend=1 00:05:48.943 --rc geninfo_all_blocks=1 00:05:48.943 --rc geninfo_unexecuted_blocks=1 00:05:48.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.943 ' 00:05:48.943 07:30:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:48.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.943 --rc genhtml_branch_coverage=1 00:05:48.943 --rc genhtml_function_coverage=1 00:05:48.943 --rc genhtml_legend=1 00:05:48.943 --rc geninfo_all_blocks=1 00:05:48.943 --rc geninfo_unexecuted_blocks=1 00:05:48.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.943 ' 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:48.943 07:30:59 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:48.943 07:30:59 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:48.943 07:30:59 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:48.943 07:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1645927 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@27 -- # waitforlisten 1645927 00:05:48.943 07:30:59 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:48.943 07:30:59 -- common/autotest_common.sh@829 -- # '[' -z 1645927 ']' 00:05:48.943 07:30:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.943 07:30:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.943 07:30:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.943 07:30:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.943 07:30:59 -- common/autotest_common.sh@10 -- # set +x 00:05:48.943 [2024-11-28 07:30:59.699785] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
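spdk_tgt is started here with -m 0x3 -p 0: the mask 0x3 is binary 11, so two reactors come up (matching the "Reactor started on core 0/1" notices that follow), and -p 0 pins the main lcore to core 0. A small sketch of reading such a mask, assuming contiguous core ids:

# Which cores does a cpumask select? 0x3 -> cores 0 and 1.
mask=0x3
for (( core = 0; core < 8; core++ )); do
  (( (mask >> core) & 1 )) && echo "reactor expected on core $core"
done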
00:05:48.943 [2024-11-28 07:30:59.699854] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645927 ] 00:05:49.203 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.203 [2024-11-28 07:30:59.765430] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:49.203 [2024-11-28 07:30:59.801580] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:49.203 [2024-11-28 07:30:59.801780] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:49.203 [2024-11-28 07:30:59.801782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.774 07:31:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.774 07:31:00 -- common/autotest_common.sh@862 -- # return 0 00:05:49.774 07:31:00 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:49.774 07:31:00 -- spdkcli/tcp.sh@31 -- # socat_pid=1646071 00:05:49.774 07:31:00 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:50.034 [ 00:05:50.034 "spdk_get_version", 00:05:50.034 "rpc_get_methods", 00:05:50.034 "trace_get_info", 00:05:50.034 "trace_get_tpoint_group_mask", 00:05:50.034 "trace_disable_tpoint_group", 00:05:50.034 "trace_enable_tpoint_group", 00:05:50.034 "trace_clear_tpoint_mask", 00:05:50.034 "trace_set_tpoint_mask", 00:05:50.034 "vfu_tgt_set_base_path", 00:05:50.034 "framework_get_pci_devices", 00:05:50.034 "framework_get_config", 00:05:50.034 "framework_get_subsystems", 00:05:50.034 "iobuf_get_stats", 00:05:50.034 "iobuf_set_options", 00:05:50.034 "sock_set_default_impl", 00:05:50.034 "sock_impl_set_options", 00:05:50.034 "sock_impl_get_options", 00:05:50.034 "vmd_rescan", 00:05:50.034 "vmd_remove_device", 00:05:50.034 "vmd_enable", 00:05:50.034 "accel_get_stats", 00:05:50.034 "accel_set_options", 00:05:50.034 "accel_set_driver", 00:05:50.034 "accel_crypto_key_destroy", 00:05:50.034 "accel_crypto_keys_get", 00:05:50.034 "accel_crypto_key_create", 00:05:50.034 "accel_assign_opc", 00:05:50.034 "accel_get_module_info", 00:05:50.034 "accel_get_opc_assignments", 00:05:50.034 "notify_get_notifications", 00:05:50.034 "notify_get_types", 00:05:50.034 "bdev_get_histogram", 00:05:50.034 "bdev_enable_histogram", 00:05:50.034 "bdev_set_qos_limit", 00:05:50.034 "bdev_set_qd_sampling_period", 00:05:50.034 "bdev_get_bdevs", 00:05:50.034 "bdev_reset_iostat", 00:05:50.034 "bdev_get_iostat", 00:05:50.034 "bdev_examine", 00:05:50.034 "bdev_wait_for_examine", 00:05:50.034 "bdev_set_options", 00:05:50.034 "scsi_get_devices", 00:05:50.034 "thread_set_cpumask", 00:05:50.034 "framework_get_scheduler", 00:05:50.034 "framework_set_scheduler", 00:05:50.034 "framework_get_reactors", 00:05:50.034 "thread_get_io_channels", 00:05:50.034 "thread_get_pollers", 00:05:50.034 "thread_get_stats", 00:05:50.034 "framework_monitor_context_switch", 00:05:50.034 "spdk_kill_instance", 00:05:50.034 "log_enable_timestamps", 00:05:50.034 "log_get_flags", 00:05:50.034 "log_clear_flag", 00:05:50.034 "log_set_flag", 00:05:50.034 "log_get_level", 00:05:50.034 "log_set_level", 00:05:50.034 "log_get_print_level", 00:05:50.034 "log_set_print_level", 00:05:50.034 "framework_enable_cpumask_locks", 00:05:50.034 "framework_disable_cpumask_locks", 00:05:50.034 "framework_wait_init", 00:05:50.034 
"framework_start_init", 00:05:50.034 "virtio_blk_create_transport", 00:05:50.034 "virtio_blk_get_transports", 00:05:50.034 "vhost_controller_set_coalescing", 00:05:50.034 "vhost_get_controllers", 00:05:50.034 "vhost_delete_controller", 00:05:50.034 "vhost_create_blk_controller", 00:05:50.034 "vhost_scsi_controller_remove_target", 00:05:50.034 "vhost_scsi_controller_add_target", 00:05:50.034 "vhost_start_scsi_controller", 00:05:50.034 "vhost_create_scsi_controller", 00:05:50.034 "ublk_recover_disk", 00:05:50.034 "ublk_get_disks", 00:05:50.034 "ublk_stop_disk", 00:05:50.034 "ublk_start_disk", 00:05:50.034 "ublk_destroy_target", 00:05:50.034 "ublk_create_target", 00:05:50.034 "nbd_get_disks", 00:05:50.034 "nbd_stop_disk", 00:05:50.034 "nbd_start_disk", 00:05:50.034 "env_dpdk_get_mem_stats", 00:05:50.034 "nvmf_subsystem_get_listeners", 00:05:50.034 "nvmf_subsystem_get_qpairs", 00:05:50.034 "nvmf_subsystem_get_controllers", 00:05:50.034 "nvmf_get_stats", 00:05:50.034 "nvmf_get_transports", 00:05:50.034 "nvmf_create_transport", 00:05:50.034 "nvmf_get_targets", 00:05:50.034 "nvmf_delete_target", 00:05:50.034 "nvmf_create_target", 00:05:50.034 "nvmf_subsystem_allow_any_host", 00:05:50.034 "nvmf_subsystem_remove_host", 00:05:50.034 "nvmf_subsystem_add_host", 00:05:50.034 "nvmf_subsystem_remove_ns", 00:05:50.034 "nvmf_subsystem_add_ns", 00:05:50.034 "nvmf_subsystem_listener_set_ana_state", 00:05:50.034 "nvmf_discovery_get_referrals", 00:05:50.034 "nvmf_discovery_remove_referral", 00:05:50.034 "nvmf_discovery_add_referral", 00:05:50.034 "nvmf_subsystem_remove_listener", 00:05:50.034 "nvmf_subsystem_add_listener", 00:05:50.034 "nvmf_delete_subsystem", 00:05:50.034 "nvmf_create_subsystem", 00:05:50.034 "nvmf_get_subsystems", 00:05:50.034 "nvmf_set_crdt", 00:05:50.034 "nvmf_set_config", 00:05:50.034 "nvmf_set_max_subsystems", 00:05:50.034 "iscsi_set_options", 00:05:50.034 "iscsi_get_auth_groups", 00:05:50.034 "iscsi_auth_group_remove_secret", 00:05:50.034 "iscsi_auth_group_add_secret", 00:05:50.034 "iscsi_delete_auth_group", 00:05:50.034 "iscsi_create_auth_group", 00:05:50.034 "iscsi_set_discovery_auth", 00:05:50.034 "iscsi_get_options", 00:05:50.034 "iscsi_target_node_request_logout", 00:05:50.034 "iscsi_target_node_set_redirect", 00:05:50.034 "iscsi_target_node_set_auth", 00:05:50.034 "iscsi_target_node_add_lun", 00:05:50.034 "iscsi_get_connections", 00:05:50.034 "iscsi_portal_group_set_auth", 00:05:50.034 "iscsi_start_portal_group", 00:05:50.034 "iscsi_delete_portal_group", 00:05:50.034 "iscsi_create_portal_group", 00:05:50.034 "iscsi_get_portal_groups", 00:05:50.034 "iscsi_delete_target_node", 00:05:50.034 "iscsi_target_node_remove_pg_ig_maps", 00:05:50.034 "iscsi_target_node_add_pg_ig_maps", 00:05:50.034 "iscsi_create_target_node", 00:05:50.034 "iscsi_get_target_nodes", 00:05:50.034 "iscsi_delete_initiator_group", 00:05:50.034 "iscsi_initiator_group_remove_initiators", 00:05:50.034 "iscsi_initiator_group_add_initiators", 00:05:50.034 "iscsi_create_initiator_group", 00:05:50.034 "iscsi_get_initiator_groups", 00:05:50.034 "vfu_virtio_create_scsi_endpoint", 00:05:50.034 "vfu_virtio_scsi_remove_target", 00:05:50.034 "vfu_virtio_scsi_add_target", 00:05:50.034 "vfu_virtio_create_blk_endpoint", 00:05:50.034 "vfu_virtio_delete_endpoint", 00:05:50.034 "iaa_scan_accel_module", 00:05:50.034 "dsa_scan_accel_module", 00:05:50.034 "ioat_scan_accel_module", 00:05:50.034 "accel_error_inject_error", 00:05:50.034 "bdev_iscsi_delete", 00:05:50.034 "bdev_iscsi_create", 00:05:50.035 "bdev_iscsi_set_options", 
00:05:50.035 "bdev_virtio_attach_controller", 00:05:50.035 "bdev_virtio_scsi_get_devices", 00:05:50.035 "bdev_virtio_detach_controller", 00:05:50.035 "bdev_virtio_blk_set_hotplug", 00:05:50.035 "bdev_ftl_set_property", 00:05:50.035 "bdev_ftl_get_properties", 00:05:50.035 "bdev_ftl_get_stats", 00:05:50.035 "bdev_ftl_unmap", 00:05:50.035 "bdev_ftl_unload", 00:05:50.035 "bdev_ftl_delete", 00:05:50.035 "bdev_ftl_load", 00:05:50.035 "bdev_ftl_create", 00:05:50.035 "bdev_aio_delete", 00:05:50.035 "bdev_aio_rescan", 00:05:50.035 "bdev_aio_create", 00:05:50.035 "blobfs_create", 00:05:50.035 "blobfs_detect", 00:05:50.035 "blobfs_set_cache_size", 00:05:50.035 "bdev_zone_block_delete", 00:05:50.035 "bdev_zone_block_create", 00:05:50.035 "bdev_delay_delete", 00:05:50.035 "bdev_delay_create", 00:05:50.035 "bdev_delay_update_latency", 00:05:50.035 "bdev_split_delete", 00:05:50.035 "bdev_split_create", 00:05:50.035 "bdev_error_inject_error", 00:05:50.035 "bdev_error_delete", 00:05:50.035 "bdev_error_create", 00:05:50.035 "bdev_raid_set_options", 00:05:50.035 "bdev_raid_remove_base_bdev", 00:05:50.035 "bdev_raid_add_base_bdev", 00:05:50.035 "bdev_raid_delete", 00:05:50.035 "bdev_raid_create", 00:05:50.035 "bdev_raid_get_bdevs", 00:05:50.035 "bdev_lvol_grow_lvstore", 00:05:50.035 "bdev_lvol_get_lvols", 00:05:50.035 "bdev_lvol_get_lvstores", 00:05:50.035 "bdev_lvol_delete", 00:05:50.035 "bdev_lvol_set_read_only", 00:05:50.035 "bdev_lvol_resize", 00:05:50.035 "bdev_lvol_decouple_parent", 00:05:50.035 "bdev_lvol_inflate", 00:05:50.035 "bdev_lvol_rename", 00:05:50.035 "bdev_lvol_clone_bdev", 00:05:50.035 "bdev_lvol_clone", 00:05:50.035 "bdev_lvol_snapshot", 00:05:50.035 "bdev_lvol_create", 00:05:50.035 "bdev_lvol_delete_lvstore", 00:05:50.035 "bdev_lvol_rename_lvstore", 00:05:50.035 "bdev_lvol_create_lvstore", 00:05:50.035 "bdev_passthru_delete", 00:05:50.035 "bdev_passthru_create", 00:05:50.035 "bdev_nvme_cuse_unregister", 00:05:50.035 "bdev_nvme_cuse_register", 00:05:50.035 "bdev_opal_new_user", 00:05:50.035 "bdev_opal_set_lock_state", 00:05:50.035 "bdev_opal_delete", 00:05:50.035 "bdev_opal_get_info", 00:05:50.035 "bdev_opal_create", 00:05:50.035 "bdev_nvme_opal_revert", 00:05:50.035 "bdev_nvme_opal_init", 00:05:50.035 "bdev_nvme_send_cmd", 00:05:50.035 "bdev_nvme_get_path_iostat", 00:05:50.035 "bdev_nvme_get_mdns_discovery_info", 00:05:50.035 "bdev_nvme_stop_mdns_discovery", 00:05:50.035 "bdev_nvme_start_mdns_discovery", 00:05:50.035 "bdev_nvme_set_multipath_policy", 00:05:50.035 "bdev_nvme_set_preferred_path", 00:05:50.035 "bdev_nvme_get_io_paths", 00:05:50.035 "bdev_nvme_remove_error_injection", 00:05:50.035 "bdev_nvme_add_error_injection", 00:05:50.035 "bdev_nvme_get_discovery_info", 00:05:50.035 "bdev_nvme_stop_discovery", 00:05:50.035 "bdev_nvme_start_discovery", 00:05:50.035 "bdev_nvme_get_controller_health_info", 00:05:50.035 "bdev_nvme_disable_controller", 00:05:50.035 "bdev_nvme_enable_controller", 00:05:50.035 "bdev_nvme_reset_controller", 00:05:50.035 "bdev_nvme_get_transport_statistics", 00:05:50.035 "bdev_nvme_apply_firmware", 00:05:50.035 "bdev_nvme_detach_controller", 00:05:50.035 "bdev_nvme_get_controllers", 00:05:50.035 "bdev_nvme_attach_controller", 00:05:50.035 "bdev_nvme_set_hotplug", 00:05:50.035 "bdev_nvme_set_options", 00:05:50.035 "bdev_null_resize", 00:05:50.035 "bdev_null_delete", 00:05:50.035 "bdev_null_create", 00:05:50.035 "bdev_malloc_delete", 00:05:50.035 "bdev_malloc_create" 00:05:50.035 ] 00:05:50.035 07:31:00 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:50.035 07:31:00 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:50.035 07:31:00 -- common/autotest_common.sh@10 -- # set +x 00:05:50.035 07:31:00 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:50.035 07:31:00 -- spdkcli/tcp.sh@38 -- # killprocess 1645927 00:05:50.035 07:31:00 -- common/autotest_common.sh@936 -- # '[' -z 1645927 ']' 00:05:50.035 07:31:00 -- common/autotest_common.sh@940 -- # kill -0 1645927 00:05:50.035 07:31:00 -- common/autotest_common.sh@941 -- # uname 00:05:50.035 07:31:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:50.035 07:31:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1645927 00:05:50.294 07:31:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:50.294 07:31:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:50.294 07:31:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1645927' 00:05:50.294 killing process with pid 1645927 00:05:50.294 07:31:00 -- common/autotest_common.sh@955 -- # kill 1645927 00:05:50.294 07:31:00 -- common/autotest_common.sh@960 -- # wait 1645927 00:05:50.554 00:05:50.554 real 0m1.614s 00:05:50.554 user 0m2.973s 00:05:50.554 sys 0m0.511s 00:05:50.554 07:31:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.554 07:31:01 -- common/autotest_common.sh@10 -- # set +x 00:05:50.554 ************************************ 00:05:50.554 END TEST spdkcli_tcp 00:05:50.554 ************************************ 00:05:50.554 07:31:01 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:50.554 07:31:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.554 07:31:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.554 07:31:01 -- common/autotest_common.sh@10 -- # set +x 00:05:50.554 ************************************ 00:05:50.554 START TEST dpdk_mem_utility 00:05:50.554 ************************************ 00:05:50.554 07:31:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:50.554 * Looking for test storage... 
00:05:50.554 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:50.554 07:31:01 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:50.554 07:31:01 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:50.554 07:31:01 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:50.815 07:31:01 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:50.815 07:31:01 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:50.815 07:31:01 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:50.815 07:31:01 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:50.815 07:31:01 -- scripts/common.sh@335 -- # IFS=.-: 00:05:50.815 07:31:01 -- scripts/common.sh@335 -- # read -ra ver1 00:05:50.815 07:31:01 -- scripts/common.sh@336 -- # IFS=.-: 00:05:50.815 07:31:01 -- scripts/common.sh@336 -- # read -ra ver2 00:05:50.815 07:31:01 -- scripts/common.sh@337 -- # local 'op=<' 00:05:50.815 07:31:01 -- scripts/common.sh@339 -- # ver1_l=2 00:05:50.815 07:31:01 -- scripts/common.sh@340 -- # ver2_l=1 00:05:50.815 07:31:01 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:50.815 07:31:01 -- scripts/common.sh@343 -- # case "$op" in 00:05:50.815 07:31:01 -- scripts/common.sh@344 -- # : 1 00:05:50.815 07:31:01 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:50.815 07:31:01 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:50.815 07:31:01 -- scripts/common.sh@364 -- # decimal 1 00:05:50.815 07:31:01 -- scripts/common.sh@352 -- # local d=1 00:05:50.815 07:31:01 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:50.815 07:31:01 -- scripts/common.sh@354 -- # echo 1 00:05:50.815 07:31:01 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:50.815 07:31:01 -- scripts/common.sh@365 -- # decimal 2 00:05:50.815 07:31:01 -- scripts/common.sh@352 -- # local d=2 00:05:50.815 07:31:01 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:50.815 07:31:01 -- scripts/common.sh@354 -- # echo 2 00:05:50.815 07:31:01 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:50.815 07:31:01 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:50.815 07:31:01 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:50.815 07:31:01 -- scripts/common.sh@367 -- # return 0 00:05:50.815 07:31:01 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:50.815 07:31:01 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:50.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.815 --rc genhtml_branch_coverage=1 00:05:50.815 --rc genhtml_function_coverage=1 00:05:50.815 --rc genhtml_legend=1 00:05:50.815 --rc geninfo_all_blocks=1 00:05:50.815 --rc geninfo_unexecuted_blocks=1 00:05:50.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.815 ' 00:05:50.815 07:31:01 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:50.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.815 --rc genhtml_branch_coverage=1 00:05:50.815 --rc genhtml_function_coverage=1 00:05:50.815 --rc genhtml_legend=1 00:05:50.815 --rc geninfo_all_blocks=1 00:05:50.815 --rc geninfo_unexecuted_blocks=1 00:05:50.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.815 ' 00:05:50.815 07:31:01 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:50.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.815 --rc 
genhtml_branch_coverage=1 00:05:50.815 --rc genhtml_function_coverage=1 00:05:50.815 --rc genhtml_legend=1 00:05:50.815 --rc geninfo_all_blocks=1 00:05:50.815 --rc geninfo_unexecuted_blocks=1 00:05:50.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.815 ' 00:05:50.815 07:31:01 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:50.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:50.815 --rc genhtml_branch_coverage=1 00:05:50.815 --rc genhtml_function_coverage=1 00:05:50.815 --rc genhtml_legend=1 00:05:50.815 --rc geninfo_all_blocks=1 00:05:50.815 --rc geninfo_unexecuted_blocks=1 00:05:50.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:50.815 ' 00:05:50.815 07:31:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:50.815 07:31:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1646270 00:05:50.815 07:31:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1646270 00:05:50.815 07:31:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:50.815 07:31:01 -- common/autotest_common.sh@829 -- # '[' -z 1646270 ']' 00:05:50.815 07:31:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:50.815 07:31:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:50.815 07:31:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:50.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:50.815 07:31:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:50.815 07:31:01 -- common/autotest_common.sh@10 -- # set +x 00:05:50.815 [2024-11-28 07:31:01.376963] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
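What follows is the heart of the utility test: the env_dpdk_get_mem_stats RPC makes the target write a raw dump (the trace shows it answering with /tmp/spdk_mem_dump.txt), and scripts/dpdk_mem_info.py renders that dump as the heap/mempool/memzone summary printed below. A sketch of driving the same flow by hand (the jq extraction is an assumption for illustration):

# Ask the running target to dump DPDK memory stats, then summarize them.
dump=$(./scripts/rpc.py env_dpdk_get_mem_stats | jq -r .filename)
echo "raw dump written to $dump"     # /tmp/spdk_mem_dump.txt in the trace
./scripts/dpdk_mem_info.py           # heap/mempool/memzone totals
./scripts/dpdk_mem_info.py -m 0      # per-element detail for heap id 0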
00:05:50.815 [2024-11-28 07:31:01.377041] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1646270 ] 00:05:50.815 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.815 [2024-11-28 07:31:01.444270] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.815 [2024-11-28 07:31:01.479979] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:50.815 [2024-11-28 07:31:01.480111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.755 07:31:02 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:51.755 07:31:02 -- common/autotest_common.sh@862 -- # return 0 00:05:51.755 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:51.755 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:51.755 07:31:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:51.755 07:31:02 -- common/autotest_common.sh@10 -- # set +x 00:05:51.755 { 00:05:51.755 "filename": "/tmp/spdk_mem_dump.txt" 00:05:51.755 } 00:05:51.755 07:31:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:51.755 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:51.755 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:51.755 1 heaps totaling size 814.000000 MiB 00:05:51.755 size: 814.000000 MiB heap id: 0 00:05:51.755 end heaps---------- 00:05:51.755 8 mempools totaling size 598.116089 MiB 00:05:51.755 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:51.755 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:51.755 size: 84.521057 MiB name: bdev_io_1646270 00:05:51.755 size: 51.011292 MiB name: evtpool_1646270 00:05:51.755 size: 50.003479 MiB name: msgpool_1646270 00:05:51.755 size: 21.763794 MiB name: PDU_Pool 00:05:51.755 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:51.755 size: 0.026123 MiB name: Session_Pool 00:05:51.755 end mempools------- 00:05:51.755 6 memzones totaling size 4.142822 MiB 00:05:51.755 size: 1.000366 MiB name: RG_ring_0_1646270 00:05:51.755 size: 1.000366 MiB name: RG_ring_1_1646270 00:05:51.755 size: 1.000366 MiB name: RG_ring_4_1646270 00:05:51.755 size: 1.000366 MiB name: RG_ring_5_1646270 00:05:51.755 size: 0.125366 MiB name: RG_ring_2_1646270 00:05:51.755 size: 0.015991 MiB name: RG_ring_3_1646270 00:05:51.755 end memzones------- 00:05:51.755 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:51.755 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:51.755 list of free elements. 
size: 12.519348 MiB 00:05:51.755 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:51.755 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:51.755 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:51.755 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:51.755 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:51.755 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:51.755 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:51.755 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:51.755 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:51.755 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:51.755 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:51.755 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:51.755 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:51.755 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:51.755 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:51.755 list of standard malloc elements. size: 199.218079 MiB 00:05:51.755 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:51.755 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:51.755 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:51.755 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:51.755 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:51.755 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:51.755 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:51.755 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:51.755 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:51.755 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:51.755 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:51.755 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:51.755 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:51.756 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:51.756 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:51.756 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:51.756 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:51.756 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:51.756 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:51.756 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:51.756 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:51.756 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:51.756 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:51.756 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:51.756 list of memzone associated elements. size: 602.262573 MiB 00:05:51.756 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:51.756 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:51.756 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:51.756 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:51.756 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:51.756 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1646270_0 00:05:51.756 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:51.756 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1646270_0 00:05:51.756 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:51.756 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1646270_0 00:05:51.756 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:51.756 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:51.756 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:51.756 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:51.756 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:51.756 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1646270 00:05:51.756 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:51.756 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1646270 00:05:51.756 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:51.756 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1646270 00:05:51.756 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:51.756 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:51.756 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:51.756 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:51.756 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:51.756 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:51.756 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:51.756 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:51.756 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:51.756 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1646270 00:05:51.756 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:51.756 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1646270 00:05:51.756 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:51.756 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1646270 00:05:51.756 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:51.756 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1646270 00:05:51.756 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:51.756 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1646270 00:05:51.756 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:51.756 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:51.756 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:51.756 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:51.756 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:51.756 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:51.756 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:51.756 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1646270 00:05:51.756 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:51.756 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:51.756 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:51.756 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:51.756 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:51.756 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1646270 00:05:51.756 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:51.756 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:51.756 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:51.756 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1646270 00:05:51.756 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:51.756 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1646270 00:05:51.756 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:51.756 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:51.756 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:51.756 07:31:02 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1646270 00:05:51.756 07:31:02 -- common/autotest_common.sh@936 -- # '[' -z 1646270 ']' 00:05:51.756 07:31:02 -- common/autotest_common.sh@940 -- # kill -0 1646270 00:05:51.756 07:31:02 -- common/autotest_common.sh@941 -- # uname 00:05:51.756 07:31:02 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:51.756 07:31:02 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1646270 00:05:51.756 07:31:02 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:51.756 07:31:02 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:51.756 07:31:02 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1646270' 00:05:51.756 killing process with pid 1646270 00:05:51.756 07:31:02 -- common/autotest_common.sh@955 -- # kill 1646270 00:05:51.756 07:31:02 -- common/autotest_common.sh@960 -- # wait 1646270 00:05:52.015 00:05:52.015 real 0m1.483s 00:05:52.015 user 0m1.509s 00:05:52.015 sys 0m0.458s 00:05:52.015 07:31:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:52.015 07:31:02 -- common/autotest_common.sh@10 -- # set +x 00:05:52.015 ************************************ 00:05:52.015 END TEST dpdk_mem_utility 00:05:52.015 ************************************ 00:05:52.015 07:31:02 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:52.015 07:31:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:52.015 07:31:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.015 07:31:02 -- common/autotest_common.sh@10 -- # set +x 
00:05:52.015 ************************************ 00:05:52.015 START TEST event 00:05:52.015 ************************************ 00:05:52.015 07:31:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:52.015 * Looking for test storage... 00:05:52.275 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:52.275 07:31:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:52.275 07:31:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:52.275 07:31:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:52.275 07:31:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:52.275 07:31:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:52.275 07:31:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:52.275 07:31:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:52.275 07:31:02 -- scripts/common.sh@335 -- # IFS=.-: 00:05:52.275 07:31:02 -- scripts/common.sh@335 -- # read -ra ver1 00:05:52.275 07:31:02 -- scripts/common.sh@336 -- # IFS=.-: 00:05:52.275 07:31:02 -- scripts/common.sh@336 -- # read -ra ver2 00:05:52.275 07:31:02 -- scripts/common.sh@337 -- # local 'op=<' 00:05:52.275 07:31:02 -- scripts/common.sh@339 -- # ver1_l=2 00:05:52.275 07:31:02 -- scripts/common.sh@340 -- # ver2_l=1 00:05:52.275 07:31:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:52.275 07:31:02 -- scripts/common.sh@343 -- # case "$op" in 00:05:52.275 07:31:02 -- scripts/common.sh@344 -- # : 1 00:05:52.275 07:31:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:52.275 07:31:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:52.275 07:31:02 -- scripts/common.sh@364 -- # decimal 1 00:05:52.275 07:31:02 -- scripts/common.sh@352 -- # local d=1 00:05:52.275 07:31:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:52.275 07:31:02 -- scripts/common.sh@354 -- # echo 1 00:05:52.275 07:31:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:52.275 07:31:02 -- scripts/common.sh@365 -- # decimal 2 00:05:52.276 07:31:02 -- scripts/common.sh@352 -- # local d=2 00:05:52.276 07:31:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:52.276 07:31:02 -- scripts/common.sh@354 -- # echo 2 00:05:52.276 07:31:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:52.276 07:31:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:52.276 07:31:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:52.276 07:31:02 -- scripts/common.sh@367 -- # return 0 00:05:52.276 07:31:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:52.276 07:31:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:52.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:52.276 --rc genhtml_branch_coverage=1 00:05:52.276 --rc genhtml_function_coverage=1 00:05:52.276 --rc genhtml_legend=1 00:05:52.276 --rc geninfo_all_blocks=1 00:05:52.276 --rc geninfo_unexecuted_blocks=1 00:05:52.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:52.276 ' 00:05:52.276 07:31:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:52.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:52.276 --rc genhtml_branch_coverage=1 00:05:52.276 --rc genhtml_function_coverage=1 00:05:52.276 --rc genhtml_legend=1 00:05:52.276 --rc geninfo_all_blocks=1 00:05:52.276 --rc geninfo_unexecuted_blocks=1 00:05:52.276 
--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:52.276 ' 00:05:52.276 07:31:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:52.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:52.276 --rc genhtml_branch_coverage=1 00:05:52.276 --rc genhtml_function_coverage=1 00:05:52.276 --rc genhtml_legend=1 00:05:52.276 --rc geninfo_all_blocks=1 00:05:52.276 --rc geninfo_unexecuted_blocks=1 00:05:52.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:52.276 ' 00:05:52.276 07:31:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:52.276 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:52.276 --rc genhtml_branch_coverage=1 00:05:52.276 --rc genhtml_function_coverage=1 00:05:52.276 --rc genhtml_legend=1 00:05:52.276 --rc geninfo_all_blocks=1 00:05:52.276 --rc geninfo_unexecuted_blocks=1 00:05:52.276 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:52.276 ' 00:05:52.276 07:31:02 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:52.276 07:31:02 -- bdev/nbd_common.sh@6 -- # set -e 00:05:52.276 07:31:02 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:52.276 07:31:02 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:52.276 07:31:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.276 07:31:02 -- common/autotest_common.sh@10 -- # set +x 00:05:52.276 ************************************ 00:05:52.276 START TEST event_perf 00:05:52.276 ************************************ 00:05:52.276 07:31:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:52.276 Running I/O for 1 seconds...[2024-11-28 07:31:02.897839] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:52.276 [2024-11-28 07:31:02.897945] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1646604 ] 00:05:52.276 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.276 [2024-11-28 07:31:02.968375] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:52.276 [2024-11-28 07:31:03.007072] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.276 [2024-11-28 07:31:03.007168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:52.276 [2024-11-28 07:31:03.007252] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:52.276 [2024-11-28 07:31:03.007254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.657 Running I/O for 1 seconds... 00:05:53.657 lcore 0: 197068 00:05:53.657 lcore 1: 197067 00:05:53.657 lcore 2: 197067 00:05:53.657 lcore 3: 197066 00:05:53.657 done. 
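The scripts/common.sh trace interleaved above ('lt 1.15 2', cmp_versions, decimal, the IFS=.-: reads) is the version gate that decides which lcov coverage flags get exported. A condensed sketch of that comparison, assuming missing fields compare as 0 (the traced helper additionally validates each field with decimal):

    version_lt() {                  # usage: version_lt 1.15 2  ->  true (returns 0)
        local IFS=.-:               # split version fields on '.', '-' and ':'
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < len; v++ )); do
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # greater: not less-than
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0   # strictly smaller
        done
        return 1                    # equal throughout: not less-than
    }

With 1.15 vs 2 the first fields already decide it (1 < 2), which is why the lcov branch above is taken and LCOV_OPTS is exported.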
00:05:53.657 00:05:53.657 real 0m1.180s 00:05:53.657 user 0m4.085s 00:05:53.657 sys 0m0.091s 00:05:53.657 07:31:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:53.657 07:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.657 ************************************ 00:05:53.657 END TEST event_perf 00:05:53.657 ************************************ 00:05:53.657 07:31:04 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:53.658 07:31:04 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:53.658 07:31:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.658 07:31:04 -- common/autotest_common.sh@10 -- # set +x 00:05:53.658 ************************************ 00:05:53.658 START TEST event_reactor 00:05:53.658 ************************************ 00:05:53.658 07:31:04 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:53.658 [2024-11-28 07:31:04.129535] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:53.658 [2024-11-28 07:31:04.129637] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1646891 ] 00:05:53.658 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.658 [2024-11-28 07:31:04.199490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.658 [2024-11-28 07:31:04.232914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.597 test_start 00:05:54.597 oneshot 00:05:54.597 tick 100 00:05:54.597 tick 100 00:05:54.597 tick 250 00:05:54.597 tick 100 00:05:54.597 tick 100 00:05:54.597 tick 100 00:05:54.597 tick 250 00:05:54.597 tick 500 00:05:54.597 tick 100 00:05:54.597 tick 100 00:05:54.597 tick 250 00:05:54.597 tick 100 00:05:54.597 tick 100 00:05:54.597 test_end 00:05:54.597 00:05:54.597 real 0m1.172s 00:05:54.597 user 0m1.078s 00:05:54.597 sys 0m0.089s 00:05:54.597 07:31:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.597 07:31:05 -- common/autotest_common.sh@10 -- # set +x 00:05:54.597 ************************************ 00:05:54.597 END TEST event_reactor 00:05:54.597 ************************************ 00:05:54.597 07:31:05 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:54.597 07:31:05 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:54.597 07:31:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.597 07:31:05 -- common/autotest_common.sh@10 -- # set +x 00:05:54.597 ************************************ 00:05:54.597 START TEST event_reactor_perf 00:05:54.597 ************************************ 00:05:54.597 07:31:05 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:54.597 [2024-11-28 07:31:05.351106] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:54.597 [2024-11-28 07:31:05.351200] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1647177 ] 00:05:54.856 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.856 [2024-11-28 07:31:05.418894] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.856 [2024-11-28 07:31:05.453045] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.795 test_start 00:05:55.795 test_end 00:05:55.795 Performance: 984284 events per second 00:05:55.795 00:05:55.795 real 0m1.172s 00:05:55.795 user 0m1.082s 00:05:55.795 sys 0m0.086s 00:05:55.795 07:31:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:55.795 07:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:55.795 ************************************ 00:05:55.795 END TEST event_reactor_perf 00:05:55.795 ************************************ 00:05:55.795 07:31:06 -- event/event.sh@49 -- # uname -s 00:05:55.795 07:31:06 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:55.795 07:31:06 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:55.795 07:31:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:55.795 07:31:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:55.795 07:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:55.795 ************************************ 00:05:55.795 START TEST event_scheduler 00:05:55.795 ************************************ 00:05:56.056 07:31:06 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:56.056 * Looking for test storage... 00:05:56.056 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:56.056 07:31:06 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:56.056 07:31:06 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:56.056 07:31:06 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:56.056 07:31:06 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:56.056 07:31:06 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:56.056 07:31:06 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:56.056 07:31:06 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:56.056 07:31:06 -- scripts/common.sh@335 -- # IFS=.-: 00:05:56.056 07:31:06 -- scripts/common.sh@335 -- # read -ra ver1 00:05:56.056 07:31:06 -- scripts/common.sh@336 -- # IFS=.-: 00:05:56.056 07:31:06 -- scripts/common.sh@336 -- # read -ra ver2 00:05:56.056 07:31:06 -- scripts/common.sh@337 -- # local 'op=<' 00:05:56.056 07:31:06 -- scripts/common.sh@339 -- # ver1_l=2 00:05:56.056 07:31:06 -- scripts/common.sh@340 -- # ver2_l=1 00:05:56.056 07:31:06 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:56.056 07:31:06 -- scripts/common.sh@343 -- # case "$op" in 00:05:56.056 07:31:06 -- scripts/common.sh@344 -- # : 1 00:05:56.056 07:31:06 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:56.056 07:31:06 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:56.056 07:31:06 -- scripts/common.sh@364 -- # decimal 1 00:05:56.056 07:31:06 -- scripts/common.sh@352 -- # local d=1 00:05:56.056 07:31:06 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:56.056 07:31:06 -- scripts/common.sh@354 -- # echo 1 00:05:56.056 07:31:06 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:56.056 07:31:06 -- scripts/common.sh@365 -- # decimal 2 00:05:56.056 07:31:06 -- scripts/common.sh@352 -- # local d=2 00:05:56.056 07:31:06 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:56.056 07:31:06 -- scripts/common.sh@354 -- # echo 2 00:05:56.056 07:31:06 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:56.056 07:31:06 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:56.056 07:31:06 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:56.056 07:31:06 -- scripts/common.sh@367 -- # return 0 00:05:56.056 07:31:06 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:56.056 07:31:06 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:56.056 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.056 --rc genhtml_branch_coverage=1 00:05:56.056 --rc genhtml_function_coverage=1 00:05:56.056 --rc genhtml_legend=1 00:05:56.056 --rc geninfo_all_blocks=1 00:05:56.056 --rc geninfo_unexecuted_blocks=1 00:05:56.056 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.056 ' 00:05:56.056 07:31:06 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:56.056 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.056 --rc genhtml_branch_coverage=1 00:05:56.056 --rc genhtml_function_coverage=1 00:05:56.056 --rc genhtml_legend=1 00:05:56.056 --rc geninfo_all_blocks=1 00:05:56.056 --rc geninfo_unexecuted_blocks=1 00:05:56.056 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.056 ' 00:05:56.056 07:31:06 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:56.056 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.056 --rc genhtml_branch_coverage=1 00:05:56.056 --rc genhtml_function_coverage=1 00:05:56.056 --rc genhtml_legend=1 00:05:56.056 --rc geninfo_all_blocks=1 00:05:56.056 --rc geninfo_unexecuted_blocks=1 00:05:56.056 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.056 ' 00:05:56.056 07:31:06 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:56.056 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:56.056 --rc genhtml_branch_coverage=1 00:05:56.056 --rc genhtml_function_coverage=1 00:05:56.056 --rc genhtml_legend=1 00:05:56.056 --rc geninfo_all_blocks=1 00:05:56.056 --rc geninfo_unexecuted_blocks=1 00:05:56.056 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:56.056 ' 00:05:56.056 07:31:06 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:56.056 07:31:06 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:56.056 07:31:06 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1647449 00:05:56.056 07:31:06 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.056 07:31:06 -- scheduler/scheduler.sh@37 -- # waitforlisten 1647449 00:05:56.056 07:31:06 -- common/autotest_common.sh@829 -- # '[' -z 1647449 ']' 00:05:56.056 07:31:06 -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:05:56.056 07:31:06 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:56.056 07:31:06 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:56.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:56.056 07:31:06 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:56.056 07:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:56.056 [2024-11-28 07:31:06.741290] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:56.056 [2024-11-28 07:31:06.741345] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1647449 ] 00:05:56.056 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.056 [2024-11-28 07:31:06.801471] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:56.316 [2024-11-28 07:31:06.842220] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.316 [2024-11-28 07:31:06.842304] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:56.316 [2024-11-28 07:31:06.842394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:56.316 [2024-11-28 07:31:06.842396] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:56.316 07:31:06 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:56.316 07:31:06 -- common/autotest_common.sh@862 -- # return 0 00:05:56.316 07:31:06 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:56.317 07:31:06 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 POWER: Env isn't set yet! 00:05:56.317 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:56.317 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:56.317 POWER: Cannot set governor of lcore 0 to userspace 00:05:56.317 POWER: Attempting to initialise PSTAT power management... 
00:05:56.317 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:56.317 POWER: Initialized successfully for lcore 0 power management 00:05:56.317 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:56.317 POWER: Initialized successfully for lcore 1 power management 00:05:56.317 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:56.317 POWER: Initialized successfully for lcore 2 power management 00:05:56.317 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:56.317 POWER: Initialized successfully for lcore 3 power management 00:05:56.317 [2024-11-28 07:31:06.947829] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:56.317 [2024-11-28 07:31:06.947844] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:56.317 [2024-11-28 07:31:06.947854] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:56.317 07:31:06 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:06 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:56.317 07:31:06 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:06 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 [2024-11-28 07:31:07.010088] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:56.317 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:56.317 07:31:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:56.317 07:31:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 ************************************ 00:05:56.317 START TEST scheduler_create_thread 00:05:56.317 ************************************ 00:05:56.317 07:31:07 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 2 00:05:56.317 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 3 00:05:56.317 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 4 00:05:56.317 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 5 00:05:56.317 
07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.317 6 00:05:56.317 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.317 07:31:07 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:56.317 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.317 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.576 7 00:05:56.576 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:56.576 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.576 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.576 8 00:05:56.576 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:56.576 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.576 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.576 9 00:05:56.576 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:56.576 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.576 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.576 10 00:05:56.576 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:56.576 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.576 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:56.576 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:56.576 07:31:07 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:56.576 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:56.576 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:57.515 07:31:07 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.515 07:31:07 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:57.515 07:31:07 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.515 07:31:07 -- common/autotest_common.sh@10 -- # set +x 00:05:58.895 07:31:09 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.895 07:31:09 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:58.895 07:31:09 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:58.895 07:31:09 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.895 07:31:09 -- common/autotest_common.sh@10 -- # set +x 00:05:59.833 07:31:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:59.833 00:05:59.833 real 0m3.382s 00:05:59.833 user 0m0.019s 00:05:59.833 sys 0m0.010s 00:05:59.833 07:31:10 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.833 07:31:10 -- common/autotest_common.sh@10 -- # set +x 00:05:59.833 ************************************ 00:05:59.833 END TEST scheduler_create_thread 00:05:59.833 ************************************ 00:05:59.833 07:31:10 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:59.833 07:31:10 -- scheduler/scheduler.sh@46 -- # killprocess 1647449 00:05:59.833 07:31:10 -- common/autotest_common.sh@936 -- # '[' -z 1647449 ']' 00:05:59.833 07:31:10 -- common/autotest_common.sh@940 -- # kill -0 1647449 00:05:59.833 07:31:10 -- common/autotest_common.sh@941 -- # uname 00:05:59.833 07:31:10 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:59.833 07:31:10 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1647449 00:05:59.833 07:31:10 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:59.833 07:31:10 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:59.833 07:31:10 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1647449' 00:05:59.833 killing process with pid 1647449 00:05:59.833 07:31:10 -- common/autotest_common.sh@955 -- # kill 1647449 00:05:59.833 07:31:10 -- common/autotest_common.sh@960 -- # wait 1647449 00:06:00.091 [2024-11-28 07:31:10.782001] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:00.351 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:00.351 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:00.351 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:00.351 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:00.351 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:00.351 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:00.351 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:00.351 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:00.351 00:06:00.351 real 0m4.432s 00:06:00.351 user 0m7.841s 00:06:00.351 sys 0m0.381s 00:06:00.351 07:31:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.351 07:31:10 -- common/autotest_common.sh@10 -- # set +x 00:06:00.351 ************************************ 00:06:00.351 END TEST event_scheduler 00:06:00.351 ************************************ 00:06:00.351 07:31:11 -- event/event.sh@51 -- # modprobe -n nbd 00:06:00.351 07:31:11 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:00.351 07:31:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.351 07:31:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.351 07:31:11 -- common/autotest_common.sh@10 -- # set +x 00:06:00.351 ************************************ 00:06:00.351 START TEST app_repeat 00:06:00.351 ************************************ 00:06:00.351 07:31:11 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:06:00.351 07:31:11 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:00.351 07:31:11 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:00.351 07:31:11 -- event/event.sh@13 -- # local nbd_list 00:06:00.351 07:31:11 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:00.351 07:31:11 -- 
event/event.sh@14 -- # local bdev_list 00:06:00.351 07:31:11 -- event/event.sh@15 -- # local repeat_times=4 00:06:00.351 07:31:11 -- event/event.sh@17 -- # modprobe nbd 00:06:00.351 07:31:11 -- event/event.sh@19 -- # repeat_pid=1648191 00:06:00.351 07:31:11 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:00.351 07:31:11 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:00.351 07:31:11 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1648191' 00:06:00.351 Process app_repeat pid: 1648191 00:06:00.351 07:31:11 -- event/event.sh@23 -- # for i in {0..2} 00:06:00.351 07:31:11 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:00.351 spdk_app_start Round 0 00:06:00.351 07:31:11 -- event/event.sh@25 -- # waitforlisten 1648191 /var/tmp/spdk-nbd.sock 00:06:00.351 07:31:11 -- common/autotest_common.sh@829 -- # '[' -z 1648191 ']' 00:06:00.351 07:31:11 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:00.351 07:31:11 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:00.351 07:31:11 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:00.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:00.351 07:31:11 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.351 07:31:11 -- common/autotest_common.sh@10 -- # set +x 00:06:00.351 [2024-11-28 07:31:11.084405] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:00.351 [2024-11-28 07:31:11.084501] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648191 ] 00:06:00.351 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.611 [2024-11-28 07:31:11.154243] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:00.611 [2024-11-28 07:31:11.192399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:00.611 [2024-11-28 07:31:11.192402] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.179 07:31:11 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:01.179 07:31:11 -- common/autotest_common.sh@862 -- # return 0 00:06:01.179 07:31:11 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:01.438 Malloc0 00:06:01.438 07:31:12 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:01.698 Malloc1 00:06:01.698 07:31:12 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:01.698 
07:31:12 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@12 -- # local i 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:01.698 07:31:12 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:01.957 /dev/nbd0 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:01.957 07:31:12 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:01.957 07:31:12 -- common/autotest_common.sh@867 -- # local i 00:06:01.957 07:31:12 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:01.957 07:31:12 -- common/autotest_common.sh@871 -- # break 00:06:01.957 07:31:12 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:01.957 1+0 records in 00:06:01.957 1+0 records out 00:06:01.957 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217436 s, 18.8 MB/s 00:06:01.957 07:31:12 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:01.957 07:31:12 -- common/autotest_common.sh@884 -- # size=4096 00:06:01.957 07:31:12 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:01.957 07:31:12 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:01.957 07:31:12 -- common/autotest_common.sh@887 -- # return 0 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:01.957 /dev/nbd1 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:01.957 07:31:12 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:01.957 07:31:12 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:01.957 07:31:12 -- common/autotest_common.sh@867 -- # local i 00:06:01.957 07:31:12 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:01.957 07:31:12 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:02.216 07:31:12 -- common/autotest_common.sh@871 -- # break 00:06:02.217 07:31:12 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:02.217 07:31:12 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:02.217 07:31:12 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:06:02.217 1+0 records in 00:06:02.217 1+0 records out 00:06:02.217 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0001723 s, 23.8 MB/s 00:06:02.217 07:31:12 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.217 07:31:12 -- common/autotest_common.sh@884 -- # size=4096 00:06:02.217 07:31:12 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.217 07:31:12 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:02.217 07:31:12 -- common/autotest_common.sh@887 -- # return 0 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:02.217 { 00:06:02.217 "nbd_device": "/dev/nbd0", 00:06:02.217 "bdev_name": "Malloc0" 00:06:02.217 }, 00:06:02.217 { 00:06:02.217 "nbd_device": "/dev/nbd1", 00:06:02.217 "bdev_name": "Malloc1" 00:06:02.217 } 00:06:02.217 ]' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:02.217 { 00:06:02.217 "nbd_device": "/dev/nbd0", 00:06:02.217 "bdev_name": "Malloc0" 00:06:02.217 }, 00:06:02.217 { 00:06:02.217 "nbd_device": "/dev/nbd1", 00:06:02.217 "bdev_name": "Malloc1" 00:06:02.217 } 00:06:02.217 ]' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:02.217 /dev/nbd1' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:02.217 /dev/nbd1' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@65 -- # count=2 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@95 -- # count=2 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:02.217 07:31:12 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:02.477 256+0 records in 00:06:02.477 256+0 records out 00:06:02.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115613 s, 90.7 MB/s 00:06:02.477 07:31:12 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:02.477 07:31:12 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:02.477 256+0 records in 00:06:02.477 256+0 records out 00:06:02.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019393 s, 54.1 MB/s 00:06:02.477 07:31:13 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:02.477 256+0 records in 00:06:02.477 256+0 records out 00:06:02.477 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208321 s, 50.3 MB/s 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@51 -- # local i 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:02.477 07:31:13 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@41 -- # break 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@45 -- # return 0 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@41 -- # break 00:06:02.737 07:31:13 -- 
bdev/nbd_common.sh@45 -- # return 0 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.737 07:31:13 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@65 -- # true 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@65 -- # count=0 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@104 -- # count=0 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:02.996 07:31:13 -- bdev/nbd_common.sh@109 -- # return 0 00:06:02.996 07:31:13 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:03.255 07:31:13 -- event/event.sh@35 -- # sleep 3 00:06:03.514 [2024-11-28 07:31:14.026731] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:03.514 [2024-11-28 07:31:14.060722] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.514 [2024-11-28 07:31:14.060725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.514 [2024-11-28 07:31:14.100539] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:03.514 [2024-11-28 07:31:14.100583] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:06.805 07:31:16 -- event/event.sh@23 -- # for i in {0..2} 00:06:06.805 07:31:16 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:06.805 spdk_app_start Round 1 00:06:06.805 07:31:16 -- event/event.sh@25 -- # waitforlisten 1648191 /var/tmp/spdk-nbd.sock 00:06:06.805 07:31:16 -- common/autotest_common.sh@829 -- # '[' -z 1648191 ']' 00:06:06.805 07:31:16 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:06.805 07:31:16 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:06.805 07:31:16 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:06.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
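Round 0 above exercised nbd_rpc_data_verify end to end. A condensed sketch of the write/verify cycle the trace performs, paraphrased from the nbd_common.sh commands shown, with paths and device names exactly as logged:

    tmp=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1)
    dd if=/dev/urandom of="$tmp" bs=4096 count=256             # one 1 MiB random pattern
    for nbd in "${nbd_list[@]}"; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct  # write the pattern to each device
    done
    for nbd in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp" "$nbd"                             # read back: any mismatch fails the test
    done
    rm "$tmp"                                                  # then nbd_stop_disks detaches both devices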
00:06:06.805 07:31:16 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:06.805 07:31:16 -- common/autotest_common.sh@10 -- # set +x 00:06:06.805 07:31:17 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:06.805 07:31:17 -- common/autotest_common.sh@862 -- # return 0 00:06:06.805 07:31:17 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:06.805 Malloc0 00:06:06.805 07:31:17 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:06.805 Malloc1 00:06:06.805 07:31:17 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@12 -- # local i 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:06.805 07:31:17 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:06.805 /dev/nbd0 00:06:07.065 07:31:17 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:07.065 07:31:17 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:07.065 07:31:17 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:07.065 07:31:17 -- common/autotest_common.sh@867 -- # local i 00:06:07.065 07:31:17 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:07.065 07:31:17 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:07.065 07:31:17 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:07.065 07:31:17 -- common/autotest_common.sh@871 -- # break 00:06:07.065 07:31:17 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:07.065 07:31:17 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:07.066 07:31:17 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.066 1+0 records in 00:06:07.066 1+0 records out 00:06:07.066 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217938 s, 18.8 MB/s 00:06:07.066 07:31:17 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.066 07:31:17 -- common/autotest_common.sh@884 -- # size=4096 00:06:07.066 07:31:17 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.066 07:31:17 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:07.066 07:31:17 -- common/autotest_common.sh@887 -- # return 0 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:07.066 /dev/nbd1 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:07.066 07:31:17 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:07.066 07:31:17 -- common/autotest_common.sh@867 -- # local i 00:06:07.066 07:31:17 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:07.066 07:31:17 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:07.066 07:31:17 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:07.066 07:31:17 -- common/autotest_common.sh@871 -- # break 00:06:07.066 07:31:17 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:07.066 07:31:17 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:07.066 07:31:17 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.066 1+0 records in 00:06:07.066 1+0 records out 00:06:07.066 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233787 s, 17.5 MB/s 00:06:07.066 07:31:17 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.066 07:31:17 -- common/autotest_common.sh@884 -- # size=4096 00:06:07.066 07:31:17 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.066 07:31:17 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:07.066 07:31:17 -- common/autotest_common.sh@887 -- # return 0 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.066 07:31:17 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:07.326 07:31:17 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:07.326 { 00:06:07.326 "nbd_device": "/dev/nbd0", 00:06:07.326 "bdev_name": "Malloc0" 00:06:07.326 }, 00:06:07.326 { 00:06:07.326 "nbd_device": "/dev/nbd1", 00:06:07.326 "bdev_name": "Malloc1" 00:06:07.326 } 00:06:07.326 ]' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:07.326 { 00:06:07.326 "nbd_device": "/dev/nbd0", 00:06:07.326 "bdev_name": "Malloc0" 00:06:07.326 }, 00:06:07.326 { 00:06:07.326 "nbd_device": "/dev/nbd1", 00:06:07.326 "bdev_name": "Malloc1" 00:06:07.326 } 00:06:07.326 ]' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:07.326 /dev/nbd1' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:07.326 /dev/nbd1' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@65 -- # count=2 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:07.326 07:31:18 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:07.326 256+0 records in 00:06:07.326 256+0 records out 00:06:07.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104476 s, 100 MB/s 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:07.326 256+0 records in 00:06:07.326 256+0 records out 00:06:07.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197289 s, 53.1 MB/s 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.326 07:31:18 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:07.586 256+0 records in 00:06:07.586 256+0 records out 00:06:07.586 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206455 s, 50.8 MB/s 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@51 -- # local i 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@41 -- # break 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@45 -- # return 0 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:07.586 07:31:18 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:07.845 07:31:18 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:07.845 07:31:18 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:07.845 07:31:18 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:07.845 07:31:18 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:07.845 07:31:18 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@41 -- # break 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@45 -- # return 0 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.846 07:31:18 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@65 -- # true 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@65 -- # count=0 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@104 -- # count=0 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:08.105 07:31:18 -- bdev/nbd_common.sh@109 -- # return 0 00:06:08.105 07:31:18 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:08.364 07:31:18 -- event/event.sh@35 -- # sleep 3 00:06:08.364 [2024-11-28 07:31:19.132036] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:08.623 [2024-11-28 07:31:19.164741] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:08.623 [2024-11-28 07:31:19.164743] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.623 [2024-11-28 07:31:19.204334] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:08.623 [2024-11-28 07:31:19.204375] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
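Each 'spdk_app_start Round N' above is one pass of the same cycle: recreate the two 64 MiB, 4096-byte-block malloc bdevs over the app's RPC socket, run the nbd write/verify, then ask the app to handle SIGTERM and come back up for the next round. A sketch of that driver loop, with the rpc.py invocations as logged and the control flow paraphrased from event.sh (the rpc shim below is shorthand, not a helper from the source):

    rpc() { /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
            -s /var/tmp/spdk-nbd.sock "$@"; }
    for i in {0..2}; do
        echo "spdk_app_start Round $i"
        rpc bdev_malloc_create 64 4096                        # -> Malloc0
        rpc bdev_malloc_create 64 4096                        # -> Malloc1
        nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
        rpc spdk_kill_instance SIGTERM                        # app handles this between rounds
        sleep 3                                               # matches the 'sleep 3' in the trace
    done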
00:06:11.912 07:31:21 -- event/event.sh@23 -- # for i in {0..2} 00:06:11.912 07:31:21 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:11.912 spdk_app_start Round 2 00:06:11.912 07:31:21 -- event/event.sh@25 -- # waitforlisten 1648191 /var/tmp/spdk-nbd.sock 00:06:11.912 07:31:21 -- common/autotest_common.sh@829 -- # '[' -z 1648191 ']' 00:06:11.912 07:31:21 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:11.912 07:31:21 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:11.912 07:31:21 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:11.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:11.912 07:31:21 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:11.912 07:31:21 -- common/autotest_common.sh@10 -- # set +x 00:06:11.912 07:31:22 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:11.912 07:31:22 -- common/autotest_common.sh@862 -- # return 0 00:06:11.912 07:31:22 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:11.912 Malloc0 00:06:11.912 07:31:22 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:11.912 Malloc1 00:06:11.912 07:31:22 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@12 -- # local i 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:11.912 07:31:22 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:12.171 /dev/nbd0 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:12.171 07:31:22 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:12.171 07:31:22 -- common/autotest_common.sh@867 -- # local i 00:06:12.171 07:31:22 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:12.171 07:31:22 -- common/autotest_common.sh@871 -- # break 00:06:12.171 07:31:22 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.171 1+0 records in 00:06:12.171 1+0 records out 00:06:12.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240772 s, 17.0 MB/s 00:06:12.171 07:31:22 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:12.171 07:31:22 -- common/autotest_common.sh@884 -- # size=4096 00:06:12.171 07:31:22 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:12.171 07:31:22 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:12.171 07:31:22 -- common/autotest_common.sh@887 -- # return 0 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:12.171 /dev/nbd1 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:12.171 07:31:22 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:12.171 07:31:22 -- common/autotest_common.sh@867 -- # local i 00:06:12.171 07:31:22 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:12.171 07:31:22 -- common/autotest_common.sh@871 -- # break 00:06:12.171 07:31:22 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:12.171 07:31:22 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.171 1+0 records in 00:06:12.171 1+0 records out 00:06:12.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235455 s, 17.4 MB/s 00:06:12.171 07:31:22 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:12.171 07:31:22 -- common/autotest_common.sh@884 -- # size=4096 00:06:12.171 07:31:22 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:12.171 07:31:22 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:12.171 07:31:22 -- common/autotest_common.sh@887 -- # return 0 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:12.171 07:31:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.430 07:31:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:12.430 { 00:06:12.430 "nbd_device": "/dev/nbd0", 00:06:12.430 "bdev_name": "Malloc0" 00:06:12.430 }, 00:06:12.430 { 00:06:12.430 "nbd_device": "/dev/nbd1", 00:06:12.430 "bdev_name": "Malloc1" 00:06:12.430 } 00:06:12.430 ]' 00:06:12.430 07:31:23 
-- bdev/nbd_common.sh@64 -- # echo '[ 00:06:12.430 { 00:06:12.430 "nbd_device": "/dev/nbd0", 00:06:12.430 "bdev_name": "Malloc0" 00:06:12.430 }, 00:06:12.430 { 00:06:12.430 "nbd_device": "/dev/nbd1", 00:06:12.430 "bdev_name": "Malloc1" 00:06:12.430 } 00:06:12.430 ]' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:12.430 /dev/nbd1' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:12.430 /dev/nbd1' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@65 -- # count=2 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@95 -- # count=2 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:12.430 256+0 records in 00:06:12.430 256+0 records out 00:06:12.430 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106874 s, 98.1 MB/s 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:12.430 07:31:23 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:12.430 256+0 records in 00:06:12.430 256+0 records out 00:06:12.430 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200892 s, 52.2 MB/s 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:12.690 256+0 records in 00:06:12.690 256+0 records out 00:06:12.690 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217516 s, 48.2 MB/s 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@51 -- # local i 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@41 -- # break 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@45 -- # return 0 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:12.690 07:31:23 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@41 -- # break 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@45 -- # return 0 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.949 07:31:23 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@65 -- # true 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@65 -- # count=0 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@104 -- # count=0 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:13.209 07:31:23 -- bdev/nbd_common.sh@109 -- # return 0 00:06:13.209 07:31:23 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:13.467 07:31:24 -- 
event/event.sh@35 -- # sleep 3 00:06:13.467 [2024-11-28 07:31:24.236712] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:13.725 [2024-11-28 07:31:24.269427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:13.725 [2024-11-28 07:31:24.269429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.725 [2024-11-28 07:31:24.309106] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:13.725 [2024-11-28 07:31:24.309149] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:17.014 07:31:27 -- event/event.sh@38 -- # waitforlisten 1648191 /var/tmp/spdk-nbd.sock 00:06:17.014 07:31:27 -- common/autotest_common.sh@829 -- # '[' -z 1648191 ']' 00:06:17.014 07:31:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:17.014 07:31:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:17.014 07:31:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:17.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:17.014 07:31:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:17.014 07:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:17.014 07:31:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.014 07:31:27 -- common/autotest_common.sh@862 -- # return 0 00:06:17.014 07:31:27 -- event/event.sh@39 -- # killprocess 1648191 00:06:17.014 07:31:27 -- common/autotest_common.sh@936 -- # '[' -z 1648191 ']' 00:06:17.014 07:31:27 -- common/autotest_common.sh@940 -- # kill -0 1648191 00:06:17.014 07:31:27 -- common/autotest_common.sh@941 -- # uname 00:06:17.014 07:31:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:17.014 07:31:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1648191 00:06:17.014 07:31:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:17.014 07:31:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:17.014 07:31:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1648191' 00:06:17.014 killing process with pid 1648191 00:06:17.014 07:31:27 -- common/autotest_common.sh@955 -- # kill 1648191 00:06:17.014 07:31:27 -- common/autotest_common.sh@960 -- # wait 1648191 00:06:17.014 spdk_app_start is called in Round 0. 00:06:17.014 Shutdown signal received, stop current app iteration 00:06:17.014 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:17.014 spdk_app_start is called in Round 1. 00:06:17.014 Shutdown signal received, stop current app iteration 00:06:17.014 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:17.014 spdk_app_start is called in Round 2. 00:06:17.014 Shutdown signal received, stop current app iteration 00:06:17.014 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:17.014 spdk_app_start is called in Round 3. 
00:06:17.014 Shutdown signal received, stop current app iteration 00:06:17.014 07:31:27 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:17.014 07:31:27 -- event/event.sh@42 -- # return 0 00:06:17.014 00:06:17.014 real 0m16.398s 00:06:17.014 user 0m35.156s 00:06:17.014 sys 0m3.031s 00:06:17.014 07:31:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:17.014 07:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:17.014 ************************************ 00:06:17.015 END TEST app_repeat 00:06:17.015 ************************************ 00:06:17.015 07:31:27 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:17.015 07:31:27 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:17.015 07:31:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:17.015 07:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:17.015 ************************************ 00:06:17.015 START TEST cpu_locks 00:06:17.015 ************************************ 00:06:17.015 07:31:27 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:17.015 * Looking for test storage... 00:06:17.015 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:17.015 07:31:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:17.015 07:31:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:17.015 07:31:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:17.015 07:31:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:17.015 07:31:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:17.015 07:31:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:17.015 07:31:27 -- scripts/common.sh@335 -- # IFS=.-: 00:06:17.015 07:31:27 -- scripts/common.sh@335 -- # read -ra ver1 00:06:17.015 07:31:27 -- scripts/common.sh@336 -- # IFS=.-: 00:06:17.015 07:31:27 -- scripts/common.sh@336 -- # read -ra ver2 00:06:17.015 07:31:27 -- scripts/common.sh@337 -- # local 'op=<' 00:06:17.015 07:31:27 -- scripts/common.sh@339 -- # ver1_l=2 00:06:17.015 07:31:27 -- scripts/common.sh@340 -- # ver2_l=1 00:06:17.015 07:31:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:17.015 07:31:27 -- scripts/common.sh@343 -- # case "$op" in 00:06:17.015 07:31:27 -- scripts/common.sh@344 -- # : 1 00:06:17.015 07:31:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:17.015 07:31:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:17.015 07:31:27 -- scripts/common.sh@364 -- # decimal 1 00:06:17.015 07:31:27 -- scripts/common.sh@352 -- # local d=1 00:06:17.015 07:31:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:17.015 07:31:27 -- scripts/common.sh@354 -- # echo 1 00:06:17.015 07:31:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:17.015 07:31:27 -- scripts/common.sh@365 -- # decimal 2 00:06:17.015 07:31:27 -- scripts/common.sh@352 -- # local d=2 00:06:17.015 07:31:27 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:17.015 07:31:27 -- scripts/common.sh@354 -- # echo 2 00:06:17.015 07:31:27 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:17.015 07:31:27 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:17.015 07:31:27 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:17.015 07:31:27 -- scripts/common.sh@367 -- # return 0 00:06:17.015 07:31:27 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:17.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.015 --rc genhtml_branch_coverage=1 00:06:17.015 --rc genhtml_function_coverage=1 00:06:17.015 --rc genhtml_legend=1 00:06:17.015 --rc geninfo_all_blocks=1 00:06:17.015 --rc geninfo_unexecuted_blocks=1 00:06:17.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.015 ' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:17.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.015 --rc genhtml_branch_coverage=1 00:06:17.015 --rc genhtml_function_coverage=1 00:06:17.015 --rc genhtml_legend=1 00:06:17.015 --rc geninfo_all_blocks=1 00:06:17.015 --rc geninfo_unexecuted_blocks=1 00:06:17.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.015 ' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:17.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.015 --rc genhtml_branch_coverage=1 00:06:17.015 --rc genhtml_function_coverage=1 00:06:17.015 --rc genhtml_legend=1 00:06:17.015 --rc geninfo_all_blocks=1 00:06:17.015 --rc geninfo_unexecuted_blocks=1 00:06:17.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.015 ' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:17.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.015 --rc genhtml_branch_coverage=1 00:06:17.015 --rc genhtml_function_coverage=1 00:06:17.015 --rc genhtml_legend=1 00:06:17.015 --rc geninfo_all_blocks=1 00:06:17.015 --rc geninfo_unexecuted_blocks=1 00:06:17.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.015 ' 00:06:17.015 07:31:27 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:17.015 07:31:27 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:17.015 07:31:27 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:17.015 07:31:27 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:17.015 07:31:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:17.015 07:31:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:17.015 07:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:17.015 ************************************ 00:06:17.015 START TEST default_locks 
00:06:17.015 ************************************ 00:06:17.015 07:31:27 -- common/autotest_common.sh@1114 -- # default_locks 00:06:17.015 07:31:27 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1651329 00:06:17.015 07:31:27 -- event/cpu_locks.sh@47 -- # waitforlisten 1651329 00:06:17.015 07:31:27 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:17.015 07:31:27 -- common/autotest_common.sh@829 -- # '[' -z 1651329 ']' 00:06:17.015 07:31:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.015 07:31:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:17.015 07:31:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.015 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.015 07:31:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:17.015 07:31:27 -- common/autotest_common.sh@10 -- # set +x 00:06:17.015 [2024-11-28 07:31:27.730471] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:17.015 [2024-11-28 07:31:27.730560] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1651329 ] 00:06:17.015 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.275 [2024-11-28 07:31:27.798759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.275 [2024-11-28 07:31:27.836135] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:17.275 [2024-11-28 07:31:27.836259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.844 07:31:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.844 07:31:28 -- common/autotest_common.sh@862 -- # return 0 00:06:17.844 07:31:28 -- event/cpu_locks.sh@49 -- # locks_exist 1651329 00:06:17.844 07:31:28 -- event/cpu_locks.sh@22 -- # lslocks -p 1651329 00:06:17.844 07:31:28 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:18.413 lslocks: write error 00:06:18.413 07:31:29 -- event/cpu_locks.sh@50 -- # killprocess 1651329 00:06:18.413 07:31:29 -- common/autotest_common.sh@936 -- # '[' -z 1651329 ']' 00:06:18.413 07:31:29 -- common/autotest_common.sh@940 -- # kill -0 1651329 00:06:18.413 07:31:29 -- common/autotest_common.sh@941 -- # uname 00:06:18.413 07:31:29 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.413 07:31:29 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1651329 00:06:18.413 07:31:29 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.413 07:31:29 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.413 07:31:29 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1651329' 00:06:18.413 killing process with pid 1651329 00:06:18.413 07:31:29 -- common/autotest_common.sh@955 -- # kill 1651329 00:06:18.413 07:31:29 -- common/autotest_common.sh@960 -- # wait 1651329 00:06:18.982 07:31:29 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1651329 00:06:18.982 07:31:29 -- common/autotest_common.sh@650 -- # local es=0 00:06:18.982 07:31:29 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1651329 00:06:18.982 07:31:29 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:18.982 07:31:29 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:18.982 07:31:29 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:18.982 07:31:29 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:18.982 07:31:29 -- common/autotest_common.sh@653 -- # waitforlisten 1651329 00:06:18.982 07:31:29 -- common/autotest_common.sh@829 -- # '[' -z 1651329 ']' 00:06:18.982 07:31:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.982 07:31:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.982 07:31:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.982 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.982 07:31:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.982 07:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.983 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1651329) - No such process 00:06:18.983 ERROR: process (pid: 1651329) is no longer running 00:06:18.983 07:31:29 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.983 07:31:29 -- common/autotest_common.sh@862 -- # return 1 00:06:18.983 07:31:29 -- common/autotest_common.sh@653 -- # es=1 00:06:18.983 07:31:29 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:18.983 07:31:29 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:18.983 07:31:29 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:18.983 07:31:29 -- event/cpu_locks.sh@54 -- # no_locks 00:06:18.983 07:31:29 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:18.983 07:31:29 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:18.983 07:31:29 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:18.983 00:06:18.983 real 0m1.770s 00:06:18.983 user 0m1.874s 00:06:18.983 sys 0m0.684s 00:06:18.983 07:31:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.983 07:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.983 ************************************ 00:06:18.983 END TEST default_locks 00:06:18.983 ************************************ 00:06:18.983 07:31:29 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:18.983 07:31:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.983 07:31:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.983 07:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.983 ************************************ 00:06:18.983 START TEST default_locks_via_rpc 00:06:18.983 ************************************ 00:06:18.983 07:31:29 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:18.983 07:31:29 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1651800 00:06:18.983 07:31:29 -- event/cpu_locks.sh@63 -- # waitforlisten 1651800 00:06:18.983 07:31:29 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:18.983 07:31:29 -- common/autotest_common.sh@829 -- # '[' -z 1651800 ']' 00:06:18.983 07:31:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.983 07:31:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.983 07:31:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:18.983 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.983 07:31:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.983 07:31:29 -- common/autotest_common.sh@10 -- # set +x 00:06:18.983 [2024-11-28 07:31:29.551435] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:18.983 [2024-11-28 07:31:29.551525] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1651800 ] 00:06:18.983 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.983 [2024-11-28 07:31:29.619245] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.983 [2024-11-28 07:31:29.654897] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:18.983 [2024-11-28 07:31:29.655022] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.923 07:31:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.923 07:31:30 -- common/autotest_common.sh@862 -- # return 0 00:06:19.923 07:31:30 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:19.923 07:31:30 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:19.923 07:31:30 -- common/autotest_common.sh@10 -- # set +x 00:06:19.923 07:31:30 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:19.923 07:31:30 -- event/cpu_locks.sh@67 -- # no_locks 00:06:19.923 07:31:30 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:19.923 07:31:30 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:19.923 07:31:30 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:19.923 07:31:30 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:19.923 07:31:30 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:19.923 07:31:30 -- common/autotest_common.sh@10 -- # set +x 00:06:19.923 07:31:30 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:19.923 07:31:30 -- event/cpu_locks.sh@71 -- # locks_exist 1651800 00:06:19.923 07:31:30 -- event/cpu_locks.sh@22 -- # lslocks -p 1651800 00:06:19.923 07:31:30 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:19.923 07:31:30 -- event/cpu_locks.sh@73 -- # killprocess 1651800 00:06:19.923 07:31:30 -- common/autotest_common.sh@936 -- # '[' -z 1651800 ']' 00:06:19.923 07:31:30 -- common/autotest_common.sh@940 -- # kill -0 1651800 00:06:19.923 07:31:30 -- common/autotest_common.sh@941 -- # uname 00:06:19.923 07:31:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:19.923 07:31:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1651800 00:06:20.183 07:31:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:20.183 07:31:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:20.183 07:31:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1651800' 00:06:20.183 killing process with pid 1651800 00:06:20.183 07:31:30 -- common/autotest_common.sh@955 -- # kill 1651800 00:06:20.183 07:31:30 -- common/autotest_common.sh@960 -- # wait 1651800 00:06:20.443 00:06:20.443 real 0m1.510s 00:06:20.443 user 0m1.589s 00:06:20.443 sys 0m0.509s 00:06:20.443 07:31:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:20.443 07:31:31 -- common/autotest_common.sh@10 -- # set +x 00:06:20.443 ************************************ 00:06:20.443 END TEST default_locks_via_rpc 00:06:20.443 
************************************ 00:06:20.443 07:31:31 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:20.443 07:31:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:20.443 07:31:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:20.443 07:31:31 -- common/autotest_common.sh@10 -- # set +x 00:06:20.443 ************************************ 00:06:20.443 START TEST non_locking_app_on_locked_coremask 00:06:20.443 ************************************ 00:06:20.443 07:31:31 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:20.443 07:31:31 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1652111 00:06:20.443 07:31:31 -- event/cpu_locks.sh@81 -- # waitforlisten 1652111 /var/tmp/spdk.sock 00:06:20.443 07:31:31 -- common/autotest_common.sh@829 -- # '[' -z 1652111 ']' 00:06:20.443 07:31:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.443 07:31:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:20.443 07:31:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.443 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:20.443 07:31:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:20.443 07:31:31 -- common/autotest_common.sh@10 -- # set +x 00:06:20.443 07:31:31 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:20.443 [2024-11-28 07:31:31.104169] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:20.443 [2024-11-28 07:31:31.104260] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652111 ] 00:06:20.443 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.443 [2024-11-28 07:31:31.171946] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.443 [2024-11-28 07:31:31.209971] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:20.443 [2024-11-28 07:31:31.210092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.381 07:31:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:21.381 07:31:31 -- common/autotest_common.sh@862 -- # return 0 00:06:21.381 07:31:31 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1652176 00:06:21.381 07:31:31 -- event/cpu_locks.sh@85 -- # waitforlisten 1652176 /var/tmp/spdk2.sock 00:06:21.381 07:31:31 -- common/autotest_common.sh@829 -- # '[' -z 1652176 ']' 00:06:21.381 07:31:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:21.381 07:31:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:21.381 07:31:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:21.381 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
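The two lock tests that finished above both lean on the locks_exist helper visible at cpu_locks.sh@22: lslocks is run against the target pid and its output grepped for the spdk_cpu_lock file, so the check fails unless the daemon holds its core lock. default_locks_via_rpc wraps the same check around an RPC toggle. A hedged reconstruction from the traced commands (the lock_files glob is inferred from the empty array in the log and assumes nullglob):

    locks_exist() {
        lslocks -p "$1" | grep -q spdk_cpu_lock       # non-zero exit: no core lock held
    }
    rpc_cmd framework_disable_cpumask_locks           # release the /var/tmp/spdk_cpu_lock_* files
    lock_files=(/var/tmp/spdk_cpu_lock_*)             # assumes nullglob; the log shows zero entries
    (( ${#lock_files[@]} == 0 ))                      # expect none while locks are disabled
    rpc_cmd framework_enable_cpumask_locks            # re-acquire the core lock
    locks_exist "$spdk_tgt_pid"                       # the lock must be back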
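The launch now underway is the core of non_locking_app_on_locked_coremask: the first target has just claimed core 0, and a second target is started on the same core but with --disable-cpumask-locks, so it must come up cleanly on its own RPC socket instead of aborting. The traced pair of launches, condensed (binary paths shortened):

    spdk_tgt -m 0x1 &                                                  # pid 1652111; takes the core-0 lock
    waitforlisten $! /var/tmp/spdk.sock
    spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &   # pid 1652176; opts out of locking
    waitforlisten $! /var/tmp/spdk2.sock                               # must succeed despite the overlap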
00:06:21.381 07:31:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:21.381 07:31:31 -- common/autotest_common.sh@10 -- # set +x 00:06:21.381 07:31:31 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:21.381 [2024-11-28 07:31:31.945477] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:21.381 [2024-11-28 07:31:31.945541] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652176 ] 00:06:21.381 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.381 [2024-11-28 07:31:32.034296] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:21.381 [2024-11-28 07:31:32.034324] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.381 [2024-11-28 07:31:32.111610] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.381 [2024-11-28 07:31:32.111728] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.351 07:31:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:22.351 07:31:32 -- common/autotest_common.sh@862 -- # return 0 00:06:22.351 07:31:32 -- event/cpu_locks.sh@87 -- # locks_exist 1652111 00:06:22.351 07:31:32 -- event/cpu_locks.sh@22 -- # lslocks -p 1652111 00:06:22.351 07:31:32 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:23.005 lslocks: write error 00:06:23.005 07:31:33 -- event/cpu_locks.sh@89 -- # killprocess 1652111 00:06:23.005 07:31:33 -- common/autotest_common.sh@936 -- # '[' -z 1652111 ']' 00:06:23.005 07:31:33 -- common/autotest_common.sh@940 -- # kill -0 1652111 00:06:23.005 07:31:33 -- common/autotest_common.sh@941 -- # uname 00:06:23.005 07:31:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:23.005 07:31:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1652111 00:06:23.005 07:31:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:23.005 07:31:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:23.005 07:31:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1652111' 00:06:23.005 killing process with pid 1652111 00:06:23.005 07:31:33 -- common/autotest_common.sh@955 -- # kill 1652111 00:06:23.005 07:31:33 -- common/autotest_common.sh@960 -- # wait 1652111 00:06:23.574 07:31:34 -- event/cpu_locks.sh@90 -- # killprocess 1652176 00:06:23.574 07:31:34 -- common/autotest_common.sh@936 -- # '[' -z 1652176 ']' 00:06:23.574 07:31:34 -- common/autotest_common.sh@940 -- # kill -0 1652176 00:06:23.574 07:31:34 -- common/autotest_common.sh@941 -- # uname 00:06:23.574 07:31:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:23.574 07:31:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1652176 00:06:23.833 07:31:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:23.833 07:31:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:23.833 07:31:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1652176' 00:06:23.833 killing process with pid 1652176 00:06:23.833 07:31:34 -- common/autotest_common.sh@955 -- # kill 1652176 00:06:23.833 07:31:34 -- common/autotest_common.sh@960 -- # wait 1652176 00:06:24.093 00:06:24.093 real 0m3.585s 00:06:24.093 user 0m3.886s 
00:06:24.093 sys 0m1.170s 00:06:24.093 07:31:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.093 07:31:34 -- common/autotest_common.sh@10 -- # set +x 00:06:24.093 ************************************ 00:06:24.093 END TEST non_locking_app_on_locked_coremask 00:06:24.093 ************************************ 00:06:24.093 07:31:34 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:24.093 07:31:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:24.093 07:31:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.093 07:31:34 -- common/autotest_common.sh@10 -- # set +x 00:06:24.093 ************************************ 00:06:24.093 START TEST locking_app_on_unlocked_coremask 00:06:24.093 ************************************ 00:06:24.093 07:31:34 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:24.093 07:31:34 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1652750 00:06:24.093 07:31:34 -- event/cpu_locks.sh@99 -- # waitforlisten 1652750 /var/tmp/spdk.sock 00:06:24.093 07:31:34 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:24.093 07:31:34 -- common/autotest_common.sh@829 -- # '[' -z 1652750 ']' 00:06:24.093 07:31:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:24.093 07:31:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.093 07:31:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:24.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:24.093 07:31:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.093 07:31:34 -- common/autotest_common.sh@10 -- # set +x 00:06:24.093 [2024-11-28 07:31:34.741118] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:24.093 [2024-11-28 07:31:34.741210] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652750 ] 00:06:24.093 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.093 [2024-11-28 07:31:34.806084] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
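That "CPU core locks deactivated" notice is the premise of locking_app_on_unlocked_coremask: because the first target opted out with --disable-cpumask-locks, no lock file exists on core 0, and the second, normally locking target must start without complaint. The traced order, condensed (binary paths shortened):

    spdk_tgt -m 0x1 --disable-cpumask-locks &        # pid 1652750; creates no lock file
    waitforlisten $! /var/tmp/spdk.sock
    spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &         # pid 1652918; locks core 0 unopposed
    waitforlisten $! /var/tmp/spdk2.sock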
00:06:24.093 [2024-11-28 07:31:34.806121] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.093 [2024-11-28 07:31:34.838437] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.093 [2024-11-28 07:31:34.838560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.030 07:31:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.030 07:31:35 -- common/autotest_common.sh@862 -- # return 0 00:06:25.030 07:31:35 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1652918 00:06:25.030 07:31:35 -- event/cpu_locks.sh@103 -- # waitforlisten 1652918 /var/tmp/spdk2.sock 00:06:25.030 07:31:35 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:25.030 07:31:35 -- common/autotest_common.sh@829 -- # '[' -z 1652918 ']' 00:06:25.030 07:31:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:25.030 07:31:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.030 07:31:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:25.030 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:25.030 07:31:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.030 07:31:35 -- common/autotest_common.sh@10 -- # set +x 00:06:25.030 [2024-11-28 07:31:35.592953] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:25.030 [2024-11-28 07:31:35.593020] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652918 ] 00:06:25.030 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.030 [2024-11-28 07:31:35.683661] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.030 [2024-11-28 07:31:35.756022] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:25.030 [2024-11-28 07:31:35.756145] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.969 07:31:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.969 07:31:36 -- common/autotest_common.sh@862 -- # return 0 00:06:25.969 07:31:36 -- event/cpu_locks.sh@105 -- # locks_exist 1652918 00:06:25.969 07:31:36 -- event/cpu_locks.sh@22 -- # lslocks -p 1652918 00:06:25.969 07:31:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:26.908 lslocks: write error 00:06:26.908 07:31:37 -- event/cpu_locks.sh@107 -- # killprocess 1652750 00:06:26.908 07:31:37 -- common/autotest_common.sh@936 -- # '[' -z 1652750 ']' 00:06:26.908 07:31:37 -- common/autotest_common.sh@940 -- # kill -0 1652750 00:06:26.908 07:31:37 -- common/autotest_common.sh@941 -- # uname 00:06:26.908 07:31:37 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:26.908 07:31:37 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1652750 00:06:27.168 07:31:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:27.168 07:31:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:27.168 07:31:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1652750' 00:06:27.168 killing process with pid 1652750 00:06:27.168 07:31:37 -- common/autotest_common.sh@955 -- # kill 1652750 00:06:27.168 07:31:37 -- 
common/autotest_common.sh@960 -- # wait 1652750 00:06:27.738 07:31:38 -- event/cpu_locks.sh@108 -- # killprocess 1652918 00:06:27.738 07:31:38 -- common/autotest_common.sh@936 -- # '[' -z 1652918 ']' 00:06:27.738 07:31:38 -- common/autotest_common.sh@940 -- # kill -0 1652918 00:06:27.738 07:31:38 -- common/autotest_common.sh@941 -- # uname 00:06:27.738 07:31:38 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:27.738 07:31:38 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1652918 00:06:27.738 07:31:38 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:27.738 07:31:38 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:27.738 07:31:38 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1652918' 00:06:27.738 killing process with pid 1652918 00:06:27.738 07:31:38 -- common/autotest_common.sh@955 -- # kill 1652918 00:06:27.738 07:31:38 -- common/autotest_common.sh@960 -- # wait 1652918 00:06:27.998 00:06:27.998 real 0m3.927s 00:06:27.998 user 0m4.222s 00:06:27.998 sys 0m1.318s 00:06:27.998 07:31:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.998 07:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.998 ************************************ 00:06:27.998 END TEST locking_app_on_unlocked_coremask 00:06:27.998 ************************************ 00:06:27.998 07:31:38 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:27.998 07:31:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:27.998 07:31:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.998 07:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.998 ************************************ 00:06:27.998 START TEST locking_app_on_locked_coremask 00:06:27.998 ************************************ 00:06:27.998 07:31:38 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:27.998 07:31:38 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1653535 00:06:27.998 07:31:38 -- event/cpu_locks.sh@116 -- # waitforlisten 1653535 /var/tmp/spdk.sock 00:06:27.998 07:31:38 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:27.998 07:31:38 -- common/autotest_common.sh@829 -- # '[' -z 1653535 ']' 00:06:27.998 07:31:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.998 07:31:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.998 07:31:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.998 07:31:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.998 07:31:38 -- common/autotest_common.sh@10 -- # set +x 00:06:27.998 [2024-11-28 07:31:38.719595] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:27.998 [2024-11-28 07:31:38.719696] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653535 ] 00:06:27.998 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.258 [2024-11-28 07:31:38.788341] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.258 [2024-11-28 07:31:38.826074] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:28.258 [2024-11-28 07:31:38.826195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.827 07:31:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.827 07:31:39 -- common/autotest_common.sh@862 -- # return 0 00:06:28.827 07:31:39 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1653603 00:06:28.827 07:31:39 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1653603 /var/tmp/spdk2.sock 00:06:28.827 07:31:39 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:28.827 07:31:39 -- common/autotest_common.sh@650 -- # local es=0 00:06:28.827 07:31:39 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1653603 /var/tmp/spdk2.sock 00:06:28.827 07:31:39 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:28.827 07:31:39 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:28.827 07:31:39 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:28.827 07:31:39 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:28.827 07:31:39 -- common/autotest_common.sh@653 -- # waitforlisten 1653603 /var/tmp/spdk2.sock 00:06:28.827 07:31:39 -- common/autotest_common.sh@829 -- # '[' -z 1653603 ']' 00:06:28.827 07:31:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:28.827 07:31:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.827 07:31:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:28.827 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:28.827 07:31:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.827 07:31:39 -- common/autotest_common.sh@10 -- # set +x 00:06:28.827 [2024-11-28 07:31:39.561752] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:28.827 [2024-11-28 07:31:39.561820] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653603 ] 00:06:28.827 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.087 [2024-11-28 07:31:39.647886] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1653535 has claimed it. 00:06:29.087 [2024-11-28 07:31:39.647922] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
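The "Unable to acquire lock" error above is the expected outcome, and the es bookkeeping traced around it (autotest_common.sh@650-677) is the NOT wrapper: it runs the command, captures the exit status, folds signal deaths above 128 down to a plain failure, and succeeds only when the wrapped command failed. A condensed sketch of that inversion; the traced helper also validates the argument type and checks an allow-list, elided here:

    NOT() {
        local es=0
        "$@" || es=$?            # run the wrapped command, keep its status
        (( es > 128 )) && es=1   # signal-killed counts as ordinary failure
        (( !es == 0 ))           # exit 0 only if the command did NOT succeed
    }
    NOT waitforlisten 1653603 /var/tmp/spdk2.sock   # the second instance must never start listening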
00:06:29.656 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1653603) - No such process 00:06:29.656 ERROR: process (pid: 1653603) is no longer running 00:06:29.656 07:31:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.656 07:31:40 -- common/autotest_common.sh@862 -- # return 1 00:06:29.656 07:31:40 -- common/autotest_common.sh@653 -- # es=1 00:06:29.656 07:31:40 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:29.656 07:31:40 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:29.656 07:31:40 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:29.656 07:31:40 -- event/cpu_locks.sh@122 -- # locks_exist 1653535 00:06:29.656 07:31:40 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:29.656 07:31:40 -- event/cpu_locks.sh@22 -- # lslocks -p 1653535 00:06:30.227 lslocks: write error 00:06:30.227 07:31:40 -- event/cpu_locks.sh@124 -- # killprocess 1653535 00:06:30.227 07:31:40 -- common/autotest_common.sh@936 -- # '[' -z 1653535 ']' 00:06:30.227 07:31:40 -- common/autotest_common.sh@940 -- # kill -0 1653535 00:06:30.227 07:31:40 -- common/autotest_common.sh@941 -- # uname 00:06:30.227 07:31:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:30.227 07:31:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1653535 00:06:30.227 07:31:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:30.227 07:31:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:30.227 07:31:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1653535' 00:06:30.227 killing process with pid 1653535 00:06:30.227 07:31:40 -- common/autotest_common.sh@955 -- # kill 1653535 00:06:30.227 07:31:40 -- common/autotest_common.sh@960 -- # wait 1653535 00:06:30.486 00:06:30.486 real 0m2.398s 00:06:30.486 user 0m2.619s 00:06:30.486 sys 0m0.736s 00:06:30.486 07:31:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:30.486 07:31:41 -- common/autotest_common.sh@10 -- # set +x 00:06:30.486 ************************************ 00:06:30.486 END TEST locking_app_on_locked_coremask 00:06:30.486 ************************************ 00:06:30.486 07:31:41 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:30.486 07:31:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:30.486 07:31:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.486 07:31:41 -- common/autotest_common.sh@10 -- # set +x 00:06:30.486 ************************************ 00:06:30.486 START TEST locking_overlapped_coremask 00:06:30.486 ************************************ 00:06:30.486 07:31:41 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:30.486 07:31:41 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1653906 00:06:30.486 07:31:41 -- event/cpu_locks.sh@133 -- # waitforlisten 1653906 /var/tmp/spdk.sock 00:06:30.486 07:31:41 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:30.486 07:31:41 -- common/autotest_common.sh@829 -- # '[' -z 1653906 ']' 00:06:30.486 07:31:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.486 07:31:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.486 07:31:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:30.486 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.486 07:31:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.486 07:31:41 -- common/autotest_common.sh@10 -- # set +x 00:06:30.486 [2024-11-28 07:31:41.167468] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:30.486 [2024-11-28 07:31:41.167536] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653906 ] 00:06:30.486 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.486 [2024-11-28 07:31:41.234831] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:30.745 [2024-11-28 07:31:41.268948] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.745 [2024-11-28 07:31:41.269147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.745 [2024-11-28 07:31:41.269168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:30.745 [2024-11-28 07:31:41.269170] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.315 07:31:41 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.315 07:31:41 -- common/autotest_common.sh@862 -- # return 0 00:06:31.315 07:31:41 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1654175 00:06:31.315 07:31:41 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1654175 /var/tmp/spdk2.sock 00:06:31.315 07:31:41 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:31.315 07:31:41 -- common/autotest_common.sh@650 -- # local es=0 00:06:31.315 07:31:41 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1654175 /var/tmp/spdk2.sock 00:06:31.315 07:31:41 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:31.315 07:31:41 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.315 07:31:41 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:31.315 07:31:41 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.315 07:31:41 -- common/autotest_common.sh@653 -- # waitforlisten 1654175 /var/tmp/spdk2.sock 00:06:31.315 07:31:41 -- common/autotest_common.sh@829 -- # '[' -z 1654175 ']' 00:06:31.315 07:31:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:31.315 07:31:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.315 07:31:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:31.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:31.315 07:31:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.315 07:31:42 -- common/autotest_common.sh@10 -- # set +x 00:06:31.315 [2024-11-28 07:31:42.018233] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:31.315 [2024-11-28 07:31:42.018324] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654175 ] 00:06:31.315 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.574 [2024-11-28 07:31:42.111368] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1653906 has claimed it. 00:06:31.574 [2024-11-28 07:31:42.111407] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:32.143 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1654175) - No such process 00:06:32.143 ERROR: process (pid: 1654175) is no longer running 00:06:32.143 07:31:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:32.143 07:31:42 -- common/autotest_common.sh@862 -- # return 1 00:06:32.143 07:31:42 -- common/autotest_common.sh@653 -- # es=1 00:06:32.143 07:31:42 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:32.143 07:31:42 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:32.143 07:31:42 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:32.143 07:31:42 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:32.143 07:31:42 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:32.144 07:31:42 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:32.144 07:31:42 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:32.144 07:31:42 -- event/cpu_locks.sh@141 -- # killprocess 1653906 00:06:32.144 07:31:42 -- common/autotest_common.sh@936 -- # '[' -z 1653906 ']' 00:06:32.144 07:31:42 -- common/autotest_common.sh@940 -- # kill -0 1653906 00:06:32.144 07:31:42 -- common/autotest_common.sh@941 -- # uname 00:06:32.144 07:31:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:32.144 07:31:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1653906 00:06:32.144 07:31:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:32.144 07:31:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:32.144 07:31:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1653906' 00:06:32.144 killing process with pid 1653906 00:06:32.144 07:31:42 -- common/autotest_common.sh@955 -- # kill 1653906 00:06:32.144 07:31:42 -- common/autotest_common.sh@960 -- # wait 1653906 00:06:32.403 00:06:32.403 real 0m1.879s 00:06:32.403 user 0m5.438s 00:06:32.403 sys 0m0.414s 00:06:32.403 07:31:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.403 07:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.403 ************************************ 00:06:32.403 END TEST locking_overlapped_coremask 00:06:32.403 ************************************ 00:06:32.403 07:31:43 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:32.403 07:31:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:32.403 07:31:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.403 07:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.403 ************************************ 00:06:32.403 
START TEST locking_overlapped_coremask_via_rpc 00:06:32.403 ************************************ 00:06:32.403 07:31:43 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:32.403 07:31:43 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1654309 00:06:32.403 07:31:43 -- event/cpu_locks.sh@149 -- # waitforlisten 1654309 /var/tmp/spdk.sock 00:06:32.403 07:31:43 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:32.403 07:31:43 -- common/autotest_common.sh@829 -- # '[' -z 1654309 ']' 00:06:32.403 07:31:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:32.403 07:31:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:32.403 07:31:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:32.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:32.403 07:31:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:32.403 07:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:32.403 [2024-11-28 07:31:43.097867] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:32.403 [2024-11-28 07:31:43.097956] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654309 ] 00:06:32.404 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.404 [2024-11-28 07:31:43.167201] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:32.404 [2024-11-28 07:31:43.167236] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:32.663 [2024-11-28 07:31:43.203696] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:32.663 [2024-11-28 07:31:43.203853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.663 [2024-11-28 07:31:43.203949] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:32.663 [2024-11-28 07:31:43.203951] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.231 07:31:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:33.231 07:31:43 -- common/autotest_common.sh@862 -- # return 0 00:06:33.231 07:31:43 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:33.231 07:31:43 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1654483 00:06:33.231 07:31:43 -- event/cpu_locks.sh@153 -- # waitforlisten 1654483 /var/tmp/spdk2.sock 00:06:33.231 07:31:43 -- common/autotest_common.sh@829 -- # '[' -z 1654483 ']' 00:06:33.231 07:31:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:33.231 07:31:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:33.231 07:31:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:33.231 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
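Note: unlike the previous test, the via_rpc variant boots both targets with --disable-cpumask-locks, so no /var/tmp/spdk_cpu_lock_* files exist at startup and the locks are only taken later over JSON-RPC. A minimal sketch of that startup, assuming the build layout used in this run:

  ./build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks &
  ls /var/tmp/spdk_cpu_lock_* 2>/dev/null || echo 'no core locks held yet'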
00:06:33.231 07:31:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:33.231 07:31:43 -- common/autotest_common.sh@10 -- # set +x 00:06:33.231 [2024-11-28 07:31:43.948830] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:33.231 [2024-11-28 07:31:43.948916] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654483 ] 00:06:33.231 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.491 [2024-11-28 07:31:44.042268] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:33.491 [2024-11-28 07:31:44.042300] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:33.491 [2024-11-28 07:31:44.116135] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:33.491 [2024-11-28 07:31:44.116298] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:33.491 [2024-11-28 07:31:44.119646] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:33.491 [2024-11-28 07:31:44.119648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:34.060 07:31:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.060 07:31:44 -- common/autotest_common.sh@862 -- # return 0 00:06:34.060 07:31:44 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:34.060 07:31:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.060 07:31:44 -- common/autotest_common.sh@10 -- # set +x 00:06:34.060 07:31:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.060 07:31:44 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:34.060 07:31:44 -- common/autotest_common.sh@650 -- # local es=0 00:06:34.060 07:31:44 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:34.060 07:31:44 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:34.060 07:31:44 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:34.060 07:31:44 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:34.060 07:31:44 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:34.060 07:31:44 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:34.060 07:31:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.060 07:31:44 -- common/autotest_common.sh@10 -- # set +x 00:06:34.060 [2024-11-28 07:31:44.796663] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1654309 has claimed it. 
00:06:34.060 request: 00:06:34.060 { 00:06:34.060 "method": "framework_enable_cpumask_locks", 00:06:34.060 "req_id": 1 00:06:34.060 } 00:06:34.060 Got JSON-RPC error response 00:06:34.060 response: 00:06:34.060 { 00:06:34.060 "code": -32603, 00:06:34.060 "message": "Failed to claim CPU core: 2" 00:06:34.060 } 00:06:34.060 07:31:44 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:34.060 07:31:44 -- common/autotest_common.sh@653 -- # es=1 00:06:34.060 07:31:44 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:34.060 07:31:44 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:34.060 07:31:44 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:34.060 07:31:44 -- event/cpu_locks.sh@158 -- # waitforlisten 1654309 /var/tmp/spdk.sock 00:06:34.060 07:31:44 -- common/autotest_common.sh@829 -- # '[' -z 1654309 ']' 00:06:34.060 07:31:44 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.060 07:31:44 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.060 07:31:44 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.060 07:31:44 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.060 07:31:44 -- common/autotest_common.sh@10 -- # set +x 00:06:34.319 07:31:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.319 07:31:45 -- common/autotest_common.sh@862 -- # return 0 00:06:34.319 07:31:45 -- event/cpu_locks.sh@159 -- # waitforlisten 1654483 /var/tmp/spdk2.sock 00:06:34.319 07:31:45 -- common/autotest_common.sh@829 -- # '[' -z 1654483 ']' 00:06:34.319 07:31:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:34.319 07:31:45 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.319 07:31:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:34.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
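Note: the -32603 response above is the pass condition. Mask 0x7 covers cores 0-2 and 0x1c covers cores 2-4, so core 2 is the overlap; the primary (on spdk.sock) claims it first and the secondary's claim must fail. A sketch of the two RPC calls the test issues, assuming the stock scripts/rpc.py client that rpc_cmd wraps:

  ./scripts/rpc.py framework_enable_cpumask_locks                         # primary claims cores 0-2
  ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks  # overlaps on core 2 -> -32603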
00:06:34.319 07:31:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.319 07:31:45 -- common/autotest_common.sh@10 -- # set +x 00:06:34.577 07:31:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.577 07:31:45 -- common/autotest_common.sh@862 -- # return 0 00:06:34.577 07:31:45 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:34.577 07:31:45 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:34.577 07:31:45 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:34.577 07:31:45 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:34.577 00:06:34.577 real 0m2.126s 00:06:34.577 user 0m0.877s 00:06:34.578 sys 0m0.183s 00:06:34.578 07:31:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.578 07:31:45 -- common/autotest_common.sh@10 -- # set +x 00:06:34.578 ************************************ 00:06:34.578 END TEST locking_overlapped_coremask_via_rpc 00:06:34.578 ************************************ 00:06:34.578 07:31:45 -- event/cpu_locks.sh@174 -- # cleanup 00:06:34.578 07:31:45 -- event/cpu_locks.sh@15 -- # [[ -z 1654309 ]] 00:06:34.578 07:31:45 -- event/cpu_locks.sh@15 -- # killprocess 1654309 00:06:34.578 07:31:45 -- common/autotest_common.sh@936 -- # '[' -z 1654309 ']' 00:06:34.578 07:31:45 -- common/autotest_common.sh@940 -- # kill -0 1654309 00:06:34.578 07:31:45 -- common/autotest_common.sh@941 -- # uname 00:06:34.578 07:31:45 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:34.578 07:31:45 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1654309 00:06:34.578 07:31:45 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:34.578 07:31:45 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:34.578 07:31:45 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1654309' 00:06:34.578 killing process with pid 1654309 00:06:34.578 07:31:45 -- common/autotest_common.sh@955 -- # kill 1654309 00:06:34.578 07:31:45 -- common/autotest_common.sh@960 -- # wait 1654309 00:06:35.145 07:31:45 -- event/cpu_locks.sh@16 -- # [[ -z 1654483 ]] 00:06:35.145 07:31:45 -- event/cpu_locks.sh@16 -- # killprocess 1654483 00:06:35.145 07:31:45 -- common/autotest_common.sh@936 -- # '[' -z 1654483 ']' 00:06:35.145 07:31:45 -- common/autotest_common.sh@940 -- # kill -0 1654483 00:06:35.145 07:31:45 -- common/autotest_common.sh@941 -- # uname 00:06:35.145 07:31:45 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:35.145 07:31:45 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1654483 00:06:35.145 07:31:45 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:35.145 07:31:45 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:35.145 07:31:45 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1654483' 00:06:35.145 killing process with pid 1654483 00:06:35.145 07:31:45 -- common/autotest_common.sh@955 -- # kill 1654483 00:06:35.145 07:31:45 -- common/autotest_common.sh@960 -- # wait 1654483 00:06:35.405 07:31:45 -- event/cpu_locks.sh@18 -- # rm -f 00:06:35.405 07:31:45 -- event/cpu_locks.sh@1 -- # cleanup 00:06:35.405 07:31:45 -- event/cpu_locks.sh@15 -- # [[ -z 1654309 ]] 00:06:35.405 07:31:45 -- event/cpu_locks.sh@15 -- # killprocess 1654309 
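Note: check_remaining_locks, traced above, verifies that the surviving primary still holds exactly the lock files for its 0x7 mask. Per the trace it is a straight glob-versus-brace-expansion equality, essentially:

  locks=(/var/tmp/spdk_cpu_lock_*)
  locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
  [[ "${locks[*]}" == "${locks_expected[*]}" ]] && echo 'lock files match cores 0-2'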
00:06:35.405 07:31:45 -- common/autotest_common.sh@936 -- # '[' -z 1654309 ']' 00:06:35.405 07:31:45 -- common/autotest_common.sh@940 -- # kill -0 1654309 00:06:35.405 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1654309) - No such process 00:06:35.405 07:31:45 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1654309 is not found' 00:06:35.405 Process with pid 1654309 is not found 00:06:35.405 07:31:45 -- event/cpu_locks.sh@16 -- # [[ -z 1654483 ]] 00:06:35.405 07:31:45 -- event/cpu_locks.sh@16 -- # killprocess 1654483 00:06:35.405 07:31:45 -- common/autotest_common.sh@936 -- # '[' -z 1654483 ']' 00:06:35.405 07:31:45 -- common/autotest_common.sh@940 -- # kill -0 1654483 00:06:35.405 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1654483) - No such process 00:06:35.405 07:31:45 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1654483 is not found' 00:06:35.405 Process with pid 1654483 is not found 00:06:35.405 07:31:45 -- event/cpu_locks.sh@18 -- # rm -f 00:06:35.405 00:06:35.405 real 0m18.475s 00:06:35.405 user 0m31.522s 00:06:35.405 sys 0m5.973s 00:06:35.405 07:31:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.405 07:31:45 -- common/autotest_common.sh@10 -- # set +x 00:06:35.405 ************************************ 00:06:35.405 END TEST cpu_locks 00:06:35.405 ************************************ 00:06:35.405 00:06:35.405 real 0m43.337s 00:06:35.405 user 1m20.976s 00:06:35.405 sys 0m10.011s 00:06:35.405 07:31:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.405 07:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.405 ************************************ 00:06:35.405 END TEST event 00:06:35.405 ************************************ 00:06:35.405 07:31:46 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:35.405 07:31:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:35.405 07:31:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.405 07:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.405 ************************************ 00:06:35.405 START TEST thread 00:06:35.405 ************************************ 00:06:35.405 07:31:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:35.405 * Looking for test storage... 
00:06:35.405 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:35.405 07:31:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:35.405 07:31:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:35.405 07:31:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:35.665 07:31:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:35.665 07:31:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:35.665 07:31:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:35.665 07:31:46 -- scripts/common.sh@335 -- # IFS=.-: 00:06:35.665 07:31:46 -- scripts/common.sh@335 -- # read -ra ver1 00:06:35.665 07:31:46 -- scripts/common.sh@336 -- # IFS=.-: 00:06:35.665 07:31:46 -- scripts/common.sh@336 -- # read -ra ver2 00:06:35.665 07:31:46 -- scripts/common.sh@337 -- # local 'op=<' 00:06:35.665 07:31:46 -- scripts/common.sh@339 -- # ver1_l=2 00:06:35.665 07:31:46 -- scripts/common.sh@340 -- # ver2_l=1 00:06:35.665 07:31:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:35.665 07:31:46 -- scripts/common.sh@343 -- # case "$op" in 00:06:35.665 07:31:46 -- scripts/common.sh@344 -- # : 1 00:06:35.665 07:31:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:35.665 07:31:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:35.665 07:31:46 -- scripts/common.sh@364 -- # decimal 1 00:06:35.665 07:31:46 -- scripts/common.sh@352 -- # local d=1 00:06:35.665 07:31:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:35.665 07:31:46 -- scripts/common.sh@354 -- # echo 1 00:06:35.665 07:31:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:35.665 07:31:46 -- scripts/common.sh@365 -- # decimal 2 00:06:35.665 07:31:46 -- scripts/common.sh@352 -- # local d=2 00:06:35.665 07:31:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:35.665 07:31:46 -- scripts/common.sh@354 -- # echo 2 00:06:35.665 07:31:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:35.665 07:31:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:35.665 07:31:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:35.665 07:31:46 -- scripts/common.sh@367 -- # return 0 00:06:35.665 07:31:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:35.665 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.665 --rc genhtml_branch_coverage=1 00:06:35.665 --rc genhtml_function_coverage=1 00:06:35.665 --rc genhtml_legend=1 00:06:35.665 --rc geninfo_all_blocks=1 00:06:35.665 --rc geninfo_unexecuted_blocks=1 00:06:35.665 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.665 ' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:35.665 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.665 --rc genhtml_branch_coverage=1 00:06:35.665 --rc genhtml_function_coverage=1 00:06:35.665 --rc genhtml_legend=1 00:06:35.665 --rc geninfo_all_blocks=1 00:06:35.665 --rc geninfo_unexecuted_blocks=1 00:06:35.665 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.665 ' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:35.665 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.665 --rc genhtml_branch_coverage=1 
00:06:35.665 --rc genhtml_function_coverage=1 00:06:35.665 --rc genhtml_legend=1 00:06:35.665 --rc geninfo_all_blocks=1 00:06:35.665 --rc geninfo_unexecuted_blocks=1 00:06:35.665 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.665 ' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:35.665 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.665 --rc genhtml_branch_coverage=1 00:06:35.665 --rc genhtml_function_coverage=1 00:06:35.665 --rc genhtml_legend=1 00:06:35.665 --rc geninfo_all_blocks=1 00:06:35.665 --rc geninfo_unexecuted_blocks=1 00:06:35.665 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.665 ' 00:06:35.665 07:31:46 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:35.665 07:31:46 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:35.665 07:31:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.665 07:31:46 -- common/autotest_common.sh@10 -- # set +x 00:06:35.665 ************************************ 00:06:35.665 START TEST thread_poller_perf 00:06:35.665 ************************************ 00:06:35.665 07:31:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:35.665 [2024-11-28 07:31:46.277622] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:35.665 [2024-11-28 07:31:46.277720] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655013 ] 00:06:35.665 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.665 [2024-11-28 07:31:46.347533] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.665 [2024-11-28 07:31:46.384276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.665 Running 1000 pollers for 1 seconds with 1 microseconds period. 
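Note: the banner above decodes the poller_perf flags used throughout this suite; as far as the trace shows, -b is the poller count, -l the poller period in microseconds (0 meaning untimed, busy-polled pollers), and -t the run time in seconds:

  ./test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1   # 1000 pollers, 1 us period, 1 s run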
00:06:37.044 [2024-11-28T06:31:47.814Z] ====================================== 00:06:37.044 [2024-11-28T06:31:47.814Z] busy:2506071234 (cyc) 00:06:37.044 [2024-11-28T06:31:47.814Z] total_run_count: 796000 00:06:37.044 [2024-11-28T06:31:47.814Z] tsc_hz: 2500000000 (cyc) 00:06:37.044 [2024-11-28T06:31:47.814Z] ====================================== 00:06:37.044 [2024-11-28T06:31:47.814Z] poller_cost: 3148 (cyc), 1259 (nsec) 00:06:37.044 00:06:37.044 real 0m1.179s 00:06:37.044 user 0m1.083s 00:06:37.044 sys 0m0.091s 00:06:37.044 07:31:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.044 07:31:47 -- common/autotest_common.sh@10 -- # set +x 00:06:37.044 ************************************ 00:06:37.044 END TEST thread_poller_perf 00:06:37.044 ************************************ 00:06:37.044 07:31:47 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:37.044 07:31:47 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:37.044 07:31:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.044 07:31:47 -- common/autotest_common.sh@10 -- # set +x 00:06:37.044 ************************************ 00:06:37.044 START TEST thread_poller_perf 00:06:37.044 ************************************ 00:06:37.044 07:31:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:37.044 [2024-11-28 07:31:47.502881] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:37.044 [2024-11-28 07:31:47.502996] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655159 ] 00:06:37.044 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.044 [2024-11-28 07:31:47.574154] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.044 [2024-11-28 07:31:47.609169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.044 Running 1000 pollers for 1 seconds with 0 microseconds period. 
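Note: in these result tables poller_cost is simply busy cycles divided by total_run_count, converted to nanoseconds via the reported tsc_hz. A quick check against the 1 us-period run above:

  awk 'BEGIN { busy=2506071234; runs=796000; hz=2500000000;
               cyc=busy/runs; printf "%d cyc, %d nsec\n", cyc, cyc*1e9/hz }'
  # prints: 3148 cyc, 1259 nsec, matching the report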
00:06:37.982 [2024-11-28T06:31:48.752Z] ====================================== 00:06:37.982 [2024-11-28T06:31:48.752Z] busy:2501949320 (cyc) 00:06:37.982 [2024-11-28T06:31:48.752Z] total_run_count: 13261000 00:06:37.982 [2024-11-28T06:31:48.752Z] tsc_hz: 2500000000 (cyc) 00:06:37.982 [2024-11-28T06:31:48.752Z] ====================================== 00:06:37.982 [2024-11-28T06:31:48.752Z] poller_cost: 188 (cyc), 75 (nsec) 00:06:37.982 00:06:37.982 real 0m1.175s 00:06:37.982 user 0m1.082s 00:06:37.982 sys 0m0.088s 00:06:37.982 07:31:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.982 07:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.982 ************************************ 00:06:37.982 END TEST thread_poller_perf 00:06:37.982 ************************************ 00:06:37.982 07:31:48 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:37.982 07:31:48 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:37.982 07:31:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:37.982 07:31:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.982 07:31:48 -- common/autotest_common.sh@10 -- # set +x 00:06:37.982 ************************************ 00:06:37.982 START TEST thread_spdk_lock 00:06:37.982 ************************************ 00:06:37.982 07:31:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:37.982 [2024-11-28 07:31:48.726429] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:37.982 [2024-11-28 07:31:48.726543] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655432 ] 00:06:38.242 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.242 [2024-11-28 07:31:48.796700] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:38.242 [2024-11-28 07:31:48.833028] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:38.242 [2024-11-28 07:31:48.833033] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.810 [2024-11-28 07:31:49.323964] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:38.810 [2024-11-28 07:31:49.324000] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:38.810 [2024-11-28 07:31:49.324010] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:06:38.810 [2024-11-28 07:31:49.324804] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:38.810 [2024-11-28 07:31:49.324907] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:38.810 [2024-11-28 07:31:49.324926] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:38.810 Starting test contend 00:06:38.810 Worker Delay Wait us Hold us Total us 00:06:38.810 0 3 170166 183800 353966 00:06:38.810 1 5 89197 285811 375009 00:06:38.810 PASS test contend 00:06:38.810 Starting test hold_by_poller 00:06:38.810 PASS test hold_by_poller 00:06:38.810 Starting test hold_by_message 00:06:38.810 PASS test hold_by_message 00:06:38.810 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:38.810 100014 assertions passed 00:06:38.810 0 assertions failed 00:06:38.810 00:06:38.810 real 0m0.664s 00:06:38.810 user 0m1.056s 00:06:38.810 sys 0m0.096s 00:06:38.810 07:31:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.810 07:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.810 ************************************ 00:06:38.810 END TEST thread_spdk_lock 00:06:38.810 ************************************ 00:06:38.810 00:06:38.810 real 0m3.346s 00:06:38.810 user 0m3.367s 00:06:38.810 sys 0m0.500s 00:06:38.810 07:31:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.810 07:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.810 ************************************ 00:06:38.810 END TEST thread 00:06:38.810 ************************************ 00:06:38.810 07:31:49 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:38.810 07:31:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:38.810 07:31:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.810 07:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:38.810 ************************************ 00:06:38.810 START TEST accel 00:06:38.810 ************************************ 00:06:38.810 07:31:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:38.810 * Looking for test storage... 00:06:38.810 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:38.810 07:31:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:38.810 07:31:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:38.810 07:31:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:39.069 07:31:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:39.069 07:31:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:39.069 07:31:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:39.069 07:31:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:39.069 07:31:49 -- scripts/common.sh@335 -- # IFS=.-: 00:06:39.069 07:31:49 -- scripts/common.sh@335 -- # read -ra ver1 00:06:39.069 07:31:49 -- scripts/common.sh@336 -- # IFS=.-: 00:06:39.069 07:31:49 -- scripts/common.sh@336 -- # read -ra ver2 00:06:39.069 07:31:49 -- scripts/common.sh@337 -- # local 'op=<' 00:06:39.069 07:31:49 -- scripts/common.sh@339 -- # ver1_l=2 00:06:39.069 07:31:49 -- scripts/common.sh@340 -- # ver2_l=1 00:06:39.069 07:31:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:39.069 07:31:49 -- scripts/common.sh@343 -- # case "$op" in 00:06:39.069 07:31:49 -- scripts/common.sh@344 -- # : 1 00:06:39.069 07:31:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:39.069 07:31:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:39.069 07:31:49 -- scripts/common.sh@364 -- # decimal 1 00:06:39.069 07:31:49 -- scripts/common.sh@352 -- # local d=1 00:06:39.069 07:31:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:39.069 07:31:49 -- scripts/common.sh@354 -- # echo 1 00:06:39.069 07:31:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:39.069 07:31:49 -- scripts/common.sh@365 -- # decimal 2 00:06:39.069 07:31:49 -- scripts/common.sh@352 -- # local d=2 00:06:39.069 07:31:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:39.069 07:31:49 -- scripts/common.sh@354 -- # echo 2 00:06:39.069 07:31:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:39.069 07:31:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:39.069 07:31:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:39.069 07:31:49 -- scripts/common.sh@367 -- # return 0 00:06:39.069 07:31:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:39.069 07:31:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:39.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.069 --rc genhtml_branch_coverage=1 00:06:39.069 --rc genhtml_function_coverage=1 00:06:39.069 --rc genhtml_legend=1 00:06:39.069 --rc geninfo_all_blocks=1 00:06:39.069 --rc geninfo_unexecuted_blocks=1 00:06:39.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.069 ' 00:06:39.069 07:31:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:39.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.069 --rc genhtml_branch_coverage=1 00:06:39.069 --rc genhtml_function_coverage=1 00:06:39.069 --rc genhtml_legend=1 00:06:39.069 --rc geninfo_all_blocks=1 00:06:39.069 --rc geninfo_unexecuted_blocks=1 00:06:39.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.069 ' 00:06:39.069 07:31:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:39.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.069 --rc genhtml_branch_coverage=1 00:06:39.069 --rc genhtml_function_coverage=1 00:06:39.069 --rc genhtml_legend=1 00:06:39.069 --rc geninfo_all_blocks=1 00:06:39.069 --rc geninfo_unexecuted_blocks=1 00:06:39.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.069 ' 00:06:39.069 07:31:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:39.069 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.069 --rc genhtml_branch_coverage=1 00:06:39.069 --rc genhtml_function_coverage=1 00:06:39.069 --rc genhtml_legend=1 00:06:39.069 --rc geninfo_all_blocks=1 00:06:39.069 --rc geninfo_unexecuted_blocks=1 00:06:39.069 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.069 ' 00:06:39.069 07:31:49 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:39.069 07:31:49 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:39.069 07:31:49 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:39.069 07:31:49 -- accel/accel.sh@59 -- # spdk_tgt_pid=1655759 00:06:39.069 07:31:49 -- accel/accel.sh@60 -- # waitforlisten 1655759 00:06:39.069 07:31:49 -- common/autotest_common.sh@829 -- # '[' -z 1655759 ']' 00:06:39.069 07:31:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:39.069 07:31:49 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:39.069 07:31:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:39.069 07:31:49 -- accel/accel.sh@58 -- # build_accel_config 00:06:39.069 07:31:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:39.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:39.069 07:31:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.069 07:31:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:39.069 07:31:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.069 07:31:49 -- common/autotest_common.sh@10 -- # set +x 00:06:39.069 07:31:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.069 07:31:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.069 07:31:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.069 07:31:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.069 07:31:49 -- accel/accel.sh@42 -- # jq -r . 00:06:39.069 [2024-11-28 07:31:49.670910] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:39.069 [2024-11-28 07:31:49.670984] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655759 ] 00:06:39.069 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.069 [2024-11-28 07:31:49.737301] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.070 [2024-11-28 07:31:49.773178] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:39.070 [2024-11-28 07:31:49.773301] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.005 07:31:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:40.005 07:31:50 -- common/autotest_common.sh@862 -- # return 0 00:06:40.005 07:31:50 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:40.005 07:31:50 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:40.005 07:31:50 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:40.005 07:31:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:40.005 07:31:50 -- common/autotest_common.sh@10 -- # set +x 00:06:40.005 07:31:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.005 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.005 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.005 07:31:50 -- accel/accel.sh@64 -- # 
IFS== 00:06:40.006 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.006 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.006 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.006 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.006 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.006 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.006 07:31:50 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:40.006 07:31:50 -- accel/accel.sh@64 -- # IFS== 00:06:40.006 07:31:50 -- accel/accel.sh@64 -- # read -r opc module 00:06:40.006 07:31:50 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:40.006 07:31:50 -- accel/accel.sh@67 -- # killprocess 1655759 00:06:40.006 07:31:50 -- common/autotest_common.sh@936 -- # '[' -z 1655759 ']' 00:06:40.006 07:31:50 -- common/autotest_common.sh@940 -- # kill -0 1655759 00:06:40.006 07:31:50 -- common/autotest_common.sh@941 -- # uname 00:06:40.006 07:31:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:40.006 07:31:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1655759 00:06:40.006 07:31:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:40.006 07:31:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:40.006 07:31:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1655759' 00:06:40.006 killing process with pid 1655759 00:06:40.006 07:31:50 -- common/autotest_common.sh@955 -- # kill 1655759 00:06:40.006 07:31:50 -- common/autotest_common.sh@960 -- # wait 1655759 00:06:40.265 07:31:50 -- accel/accel.sh@68 -- # trap - ERR 00:06:40.265 07:31:50 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:40.265 07:31:50 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:40.265 07:31:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.265 07:31:50 -- common/autotest_common.sh@10 -- # set +x 00:06:40.265 07:31:50 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:40.265 07:31:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:40.265 07:31:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.265 07:31:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.265 07:31:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.265 07:31:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.265 07:31:50 -- accel/accel.sh@42 -- # jq -r . 
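Note: every opcode in the dump above reads back as software, which is what accel_get_opc_assignments returns on a build with no hardware accel modules loaded. The trace's jq pipeline flattens the JSON map into opcode=module pairs:

  ./scripts/rpc.py accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
  # e.g. copy=software, fill=software, crc32c=software, ...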
00:06:40.265 07:31:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:40.265 07:31:50 -- common/autotest_common.sh@10 -- # set +x 00:06:40.265 07:31:50 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:40.265 07:31:50 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:40.265 07:31:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.265 07:31:50 -- common/autotest_common.sh@10 -- # set +x 00:06:40.265 ************************************ 00:06:40.265 START TEST accel_missing_filename 00:06:40.265 ************************************ 00:06:40.265 07:31:50 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:40.265 07:31:50 -- common/autotest_common.sh@650 -- # local es=0 00:06:40.265 07:31:50 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:40.265 07:31:50 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:40.265 07:31:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:40.265 07:31:50 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:40.265 07:31:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:40.265 07:31:50 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:40.265 07:31:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:40.265 07:31:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.265 07:31:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.265 07:31:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.265 07:31:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.265 07:31:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.265 07:31:50 -- accel/accel.sh@42 -- # jq -r . 00:06:40.265 [2024-11-28 07:31:50.991892] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:40.265 [2024-11-28 07:31:50.992002] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656024 ] 00:06:40.265 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.524 [2024-11-28 07:31:51.063271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.524 [2024-11-28 07:31:51.099317] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.524 [2024-11-28 07:31:51.139054] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:40.524 [2024-11-28 07:31:51.199203] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:40.524 A filename is required. 
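Note: 'A filename is required.' is the expected outcome here; accel_missing_filename runs accel_perf -t 1 -w compress with no -l input file and must see it fail. The NOT wrapper used by these negative tests inverts the exit status; a simplified sketch of the idea (the real helper in autotest_common.sh also classifies the exit code, as the es= lines in the trace show):

  NOT() {
    # succeed only when the wrapped command fails
    if "$@"; then return 1; else return 0; fi
  }
  NOT ./build/examples/accel_perf -t 1 -w compress   # passes, since compress needs -l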
00:06:40.524 07:31:51 -- common/autotest_common.sh@653 -- # es=234 00:06:40.524 07:31:51 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:40.524 07:31:51 -- common/autotest_common.sh@662 -- # es=106 00:06:40.524 07:31:51 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:40.524 07:31:51 -- common/autotest_common.sh@670 -- # es=1 00:06:40.524 07:31:51 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:40.524 00:06:40.524 real 0m0.287s 00:06:40.524 user 0m0.193s 00:06:40.524 sys 0m0.131s 00:06:40.524 07:31:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:40.524 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:40.524 ************************************ 00:06:40.524 END TEST accel_missing_filename 00:06:40.524 ************************************ 00:06:40.783 07:31:51 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:40.783 07:31:51 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:40.783 07:31:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.783 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:40.783 ************************************ 00:06:40.783 START TEST accel_compress_verify 00:06:40.783 ************************************ 00:06:40.783 07:31:51 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:40.783 07:31:51 -- common/autotest_common.sh@650 -- # local es=0 00:06:40.783 07:31:51 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:40.783 07:31:51 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:40.783 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:40.783 07:31:51 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:40.783 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:40.783 07:31:51 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:40.783 07:31:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:40.783 07:31:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.783 07:31:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.783 07:31:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.783 07:31:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.783 07:31:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.784 07:31:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.784 07:31:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.784 07:31:51 -- accel/accel.sh@42 -- # jq -r . 00:06:40.784 [2024-11-28 07:31:51.327365] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:40.784 [2024-11-28 07:31:51.327476] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656089 ] 00:06:40.784 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.784 [2024-11-28 07:31:51.396871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.784 [2024-11-28 07:31:51.432401] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.784 [2024-11-28 07:31:51.472286] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:40.784 [2024-11-28 07:31:51.531995] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:41.043 00:06:41.044 Compression does not support the verify option, aborting. 00:06:41.044 07:31:51 -- common/autotest_common.sh@653 -- # es=161 00:06:41.044 07:31:51 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:41.044 07:31:51 -- common/autotest_common.sh@662 -- # es=33 00:06:41.044 07:31:51 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:41.044 07:31:51 -- common/autotest_common.sh@670 -- # es=1 00:06:41.044 07:31:51 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:41.044 00:06:41.044 real 0m0.286s 00:06:41.044 user 0m0.192s 00:06:41.044 sys 0m0.132s 00:06:41.044 07:31:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 END TEST accel_compress_verify 00:06:41.044 ************************************ 00:06:41.044 07:31:51 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:41.044 07:31:51 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:41.044 07:31:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 START TEST accel_wrong_workload 00:06:41.044 ************************************ 00:06:41.044 07:31:51 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:41.044 07:31:51 -- common/autotest_common.sh@650 -- # local es=0 00:06:41.044 07:31:51 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:41.044 07:31:51 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.044 07:31:51 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.044 07:31:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.044 07:31:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.044 07:31:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.044 07:31:51 -- accel/accel.sh@42 -- # jq -r . 
00:06:41.044 Unsupported workload type: foobar 00:06:41.044 [2024-11-28 07:31:51.660216] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:41.044 accel_perf options: 00:06:41.044 [-h help message] 00:06:41.044 [-q queue depth per core] 00:06:41.044 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:41.044 [-T number of threads per core 00:06:41.044 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:41.044 [-t time in seconds] 00:06:41.044 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:41.044 [ dif_verify, , dif_generate, dif_generate_copy 00:06:41.044 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:41.044 [-l for compress/decompress workloads, name of uncompressed input file 00:06:41.044 [-S for crc32c workload, use this seed value (default 0) 00:06:41.044 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:41.044 [-f for fill workload, use this BYTE value (default 255) 00:06:41.044 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:41.044 [-y verify result if this switch is on] 00:06:41.044 [-a tasks to allocate per core (default: same value as -q)] 00:06:41.044 Can be used to spread operations across a wider range of memory. 00:06:41.044 07:31:51 -- common/autotest_common.sh@653 -- # es=1 00:06:41.044 07:31:51 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:41.044 07:31:51 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:41.044 07:31:51 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:41.044 00:06:41.044 real 0m0.028s 00:06:41.044 user 0m0.012s 00:06:41.044 sys 0m0.016s 00:06:41.044 07:31:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 END TEST accel_wrong_workload 00:06:41.044 ************************************ 00:06:41.044 Error: writing output failed: Broken pipe 00:06:41.044 07:31:51 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:41.044 07:31:51 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:41.044 07:31:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 START TEST accel_negative_buffers 00:06:41.044 ************************************ 00:06:41.044 07:31:51 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:41.044 07:31:51 -- common/autotest_common.sh@650 -- # local es=0 00:06:41.044 07:31:51 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:41.044 07:31:51 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:41.044 07:31:51 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.044 07:31:51 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.044 07:31:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.044 07:31:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.044 07:31:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.044 07:31:51 -- accel/accel.sh@42 -- # jq -r . 00:06:41.044 -x option must be non-negative. 00:06:41.044 [2024-11-28 07:31:51.735473] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:41.044 accel_perf options: 00:06:41.044 [-h help message] 00:06:41.044 [-q queue depth per core] 00:06:41.044 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:41.044 [-T number of threads per core 00:06:41.044 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:41.044 [-t time in seconds] 00:06:41.044 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:41.044 [ dif_verify, , dif_generate, dif_generate_copy 00:06:41.044 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:41.044 [-l for compress/decompress workloads, name of uncompressed input file 00:06:41.044 [-S for crc32c workload, use this seed value (default 0) 00:06:41.044 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:41.044 [-f for fill workload, use this BYTE value (default 255) 00:06:41.044 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:41.044 [-y verify result if this switch is on] 00:06:41.044 [-a tasks to allocate per core (default: same value as -q)] 00:06:41.044 Can be used to spread operations across a wider range of memory. 
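The two usage dumps above come from deliberately bad invocations: accel_perf is launched with an unsupported workload (-w foobar) and then with a negative xor source-buffer count (-x -1), and the harness only passes if the binary exits non-zero. A minimal bash sketch of that negative-test pattern, using the binary path shown in this log (adjust for your checkout; this is a stand-in for the NOT/valid_exec_arg machinery in autotest_common.sh, not a reproduction of it):

    # Hedged sketch: assert that accel_perf rejects invalid arguments.
    PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf

    if ! "$PERF" -t 1 -w foobar >/dev/null 2>&1; then
        echo "unsupported workload rejected, as expected"
    fi
    if ! "$PERF" -t 1 -w xor -y -x -1 >/dev/null 2>&1; then
        echo "negative buffer count rejected, as expected"
    fi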
00:06:41.044 07:31:51 -- common/autotest_common.sh@653 -- # es=1 00:06:41.044 07:31:51 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:41.044 07:31:51 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:41.044 07:31:51 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:41.044 00:06:41.044 real 0m0.028s 00:06:41.044 user 0m0.010s 00:06:41.044 sys 0m0.017s 00:06:41.044 07:31:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 END TEST accel_negative_buffers 00:06:41.044 ************************************ 00:06:41.044 Error: writing output failed: Broken pipe 00:06:41.044 07:31:51 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:41.044 07:31:51 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:41.044 07:31:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.044 07:31:51 -- common/autotest_common.sh@10 -- # set +x 00:06:41.044 ************************************ 00:06:41.044 START TEST accel_crc32c 00:06:41.044 ************************************ 00:06:41.044 07:31:51 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:41.044 07:31:51 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.044 07:31:51 -- accel/accel.sh@17 -- # local accel_module 00:06:41.044 07:31:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:41.044 07:31:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.044 07:31:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.044 07:31:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.044 07:31:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.044 07:31:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.044 07:31:51 -- accel/accel.sh@42 -- # jq -r . 00:06:41.044 [2024-11-28 07:31:51.810286] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:41.044 [2024-11-28 07:31:51.810374] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656146 ] 00:06:41.303 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.303 [2024-11-28 07:31:51.879706] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.303 [2024-11-28 07:31:51.918769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.678 07:31:53 -- accel/accel.sh@18 -- # out=' 00:06:42.678 SPDK Configuration: 00:06:42.678 Core mask: 0x1 00:06:42.678 00:06:42.678 Accel Perf Configuration: 00:06:42.678 Workload Type: crc32c 00:06:42.678 CRC-32C seed: 32 00:06:42.678 Transfer size: 4096 bytes 00:06:42.678 Vector count 1 00:06:42.678 Module: software 00:06:42.678 Queue depth: 32 00:06:42.678 Allocate depth: 32 00:06:42.678 # threads/core: 1 00:06:42.678 Run time: 1 seconds 00:06:42.678 Verify: Yes 00:06:42.678 00:06:42.678 Running for 1 seconds... 
00:06:42.678 00:06:42.678 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.678 ------------------------------------------------------------------------------------ 00:06:42.678 0,0 841408/s 3286 MiB/s 0 0 00:06:42.678 ==================================================================================== 00:06:42.678 Total 841408/s 3286 MiB/s 0 0' 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:42.678 07:31:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:42.678 07:31:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.678 07:31:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.678 07:31:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.678 07:31:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.678 07:31:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.678 07:31:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.678 07:31:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.678 07:31:53 -- accel/accel.sh@42 -- # jq -r . 00:06:42.678 [2024-11-28 07:31:53.099327] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:42.678 [2024-11-28 07:31:53.099418] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656414 ] 00:06:42.678 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.678 [2024-11-28 07:31:53.168083] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.678 [2024-11-28 07:31:53.203152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=0x1 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=crc32c 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=32 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 
07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=software 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=32 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=32 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=1 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val=Yes 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:42.678 07:31:53 -- accel/accel.sh@21 -- # val= 00:06:42.678 07:31:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # IFS=: 00:06:42.678 07:31:53 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 
00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.613 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.613 07:31:54 -- accel/accel.sh@21 -- # val= 00:06:43.613 07:31:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.614 07:31:54 -- accel/accel.sh@20 -- # IFS=: 00:06:43.614 07:31:54 -- accel/accel.sh@20 -- # read -r var val 00:06:43.614 07:31:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.614 07:31:54 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:43.614 07:31:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.614 00:06:43.614 real 0m2.577s 00:06:43.614 user 0m2.326s 00:06:43.614 sys 0m0.261s 00:06:43.614 07:31:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:43.614 07:31:54 -- common/autotest_common.sh@10 -- # set +x 00:06:43.614 ************************************ 00:06:43.614 END TEST accel_crc32c 00:06:43.614 ************************************ 00:06:43.872 07:31:54 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:43.872 07:31:54 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:43.872 07:31:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.872 07:31:54 -- common/autotest_common.sh@10 -- # set +x 00:06:43.872 ************************************ 00:06:43.872 START TEST accel_crc32c_C2 00:06:43.872 ************************************ 00:06:43.872 07:31:54 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:43.872 07:31:54 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.872 07:31:54 -- accel/accel.sh@17 -- # local accel_module 00:06:43.872 07:31:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:43.872 07:31:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:43.872 07:31:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.872 07:31:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.872 07:31:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.872 07:31:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.872 07:31:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.872 07:31:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.872 07:31:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.872 07:31:54 -- accel/accel.sh@42 -- # jq -r . 00:06:43.872 [2024-11-28 07:31:54.437654] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
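Each case in this suite is driven through run_test, which is what prints the START TEST/END TEST banners and the real/user/sys timings seen above. A rough bash stand-in for that wrapper, assuming only what the banners and time output in this log show (the real helper in autotest_common.sh also manages xtrace and error propagation):

    # Illustrative wrapper: banner, timed run, banner, preserved exit status.
    run_test_sketch() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        local rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
    }

    # e.g.: run_test_sketch accel_crc32c accel_test -t 1 -w crc32c -S 32 -y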
00:06:43.872 [2024-11-28 07:31:54.437744] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656701 ] 00:06:43.872 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.872 [2024-11-28 07:31:54.506951] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.872 [2024-11-28 07:31:54.542497] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.247 07:31:55 -- accel/accel.sh@18 -- # out=' 00:06:45.247 SPDK Configuration: 00:06:45.247 Core mask: 0x1 00:06:45.247 00:06:45.247 Accel Perf Configuration: 00:06:45.247 Workload Type: crc32c 00:06:45.247 CRC-32C seed: 0 00:06:45.247 Transfer size: 4096 bytes 00:06:45.247 Vector count 2 00:06:45.247 Module: software 00:06:45.247 Queue depth: 32 00:06:45.247 Allocate depth: 32 00:06:45.247 # threads/core: 1 00:06:45.247 Run time: 1 seconds 00:06:45.247 Verify: Yes 00:06:45.247 00:06:45.247 Running for 1 seconds... 00:06:45.247 00:06:45.247 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.247 ------------------------------------------------------------------------------------ 00:06:45.247 0,0 617120/s 2410 MiB/s 0 0 00:06:45.247 ==================================================================================== 00:06:45.247 Total 617120/s 2410 MiB/s 0 0' 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:45.247 07:31:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:45.247 07:31:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.247 07:31:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.247 07:31:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.247 07:31:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.247 07:31:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.247 07:31:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.247 07:31:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.247 07:31:55 -- accel/accel.sh@42 -- # jq -r . 00:06:45.247 [2024-11-28 07:31:55.720475] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
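The Bandwidth column is simply Transfers multiplied by the bytes each transfer covers, divided by MiB. Checking the two crc32c tables above with shell arithmetic (figures copied from the log; crc32c reports a 4096-byte transfer size whether or not the buffer is chained with -C):

    # MiB/s = transfers_per_sec * transfer_size_bytes / 1048576
    echo $(( 841408 * 4096 / 1048576 ))   # 3286 -> matches the -S 32 run
    echo $(( 617120 * 4096 / 1048576 ))   # 2410 -> matches the -C 2 run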
00:06:45.247 [2024-11-28 07:31:55.720566] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656973 ] 00:06:45.247 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.247 [2024-11-28 07:31:55.787831] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.247 [2024-11-28 07:31:55.822123] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=0x1 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=crc32c 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=0 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=software 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=32 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=32 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- 
accel/accel.sh@21 -- # val=1 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val=Yes 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:45.247 07:31:55 -- accel/accel.sh@21 -- # val= 00:06:45.247 07:31:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # IFS=: 00:06:45.247 07:31:55 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@21 -- # val= 00:06:46.623 07:31:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # IFS=: 00:06:46.623 07:31:56 -- accel/accel.sh@20 -- # read -r var val 00:06:46.623 07:31:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.623 07:31:56 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:46.624 07:31:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.624 00:06:46.624 real 0m2.565s 00:06:46.624 user 0m2.326s 00:06:46.624 sys 0m0.250s 00:06:46.624 07:31:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:46.624 07:31:56 -- common/autotest_common.sh@10 -- # set +x 00:06:46.624 ************************************ 00:06:46.624 END TEST accel_crc32c_C2 00:06:46.624 ************************************ 00:06:46.624 07:31:57 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:46.624 07:31:57 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:46.624 07:31:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:46.624 07:31:57 -- common/autotest_common.sh@10 -- # set +x 00:06:46.624 ************************************ 00:06:46.624 START TEST accel_copy 
00:06:46.624 ************************************ 00:06:46.624 07:31:57 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:46.624 07:31:57 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.624 07:31:57 -- accel/accel.sh@17 -- # local accel_module 00:06:46.624 07:31:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:46.624 07:31:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:46.624 07:31:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.624 07:31:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.624 07:31:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.624 07:31:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.624 07:31:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.624 07:31:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.624 07:31:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.624 07:31:57 -- accel/accel.sh@42 -- # jq -r . 00:06:46.624 [2024-11-28 07:31:57.048213] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:46.624 [2024-11-28 07:31:57.048303] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657177 ] 00:06:46.624 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.624 [2024-11-28 07:31:57.117891] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.624 [2024-11-28 07:31:57.153528] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.560 07:31:58 -- accel/accel.sh@18 -- # out=' 00:06:47.560 SPDK Configuration: 00:06:47.560 Core mask: 0x1 00:06:47.560 00:06:47.560 Accel Perf Configuration: 00:06:47.560 Workload Type: copy 00:06:47.560 Transfer size: 4096 bytes 00:06:47.560 Vector count 1 00:06:47.560 Module: software 00:06:47.560 Queue depth: 32 00:06:47.560 Allocate depth: 32 00:06:47.560 # threads/core: 1 00:06:47.560 Run time: 1 seconds 00:06:47.560 Verify: Yes 00:06:47.560 00:06:47.560 Running for 1 seconds... 00:06:47.560 00:06:47.560 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.560 ------------------------------------------------------------------------------------ 00:06:47.560 0,0 555072/s 2168 MiB/s 0 0 00:06:47.560 ==================================================================================== 00:06:47.560 Total 555072/s 2168 MiB/s 0 0' 00:06:47.560 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.560 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.560 07:31:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:47.560 07:31:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:47.560 07:31:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.560 07:31:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.560 07:31:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.560 07:31:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.560 07:31:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.560 07:31:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.560 07:31:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.560 07:31:58 -- accel/accel.sh@42 -- # jq -r . 00:06:47.820 [2024-11-28 07:31:58.331682] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:47.820 [2024-11-28 07:31:58.331778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657331 ] 00:06:47.820 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.820 [2024-11-28 07:31:58.399928] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.820 [2024-11-28 07:31:58.435454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=0x1 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=copy 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=software 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=32 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=32 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=1 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val=Yes 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:47.820 07:31:58 -- accel/accel.sh@21 -- # val= 00:06:47.820 07:31:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # IFS=: 00:06:47.820 07:31:58 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@21 -- # val= 00:06:49.197 07:31:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # IFS=: 00:06:49.197 07:31:59 -- accel/accel.sh@20 -- # read -r var val 00:06:49.197 07:31:59 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.197 07:31:59 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:49.197 07:31:59 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.197 00:06:49.197 real 0m2.568s 00:06:49.197 user 0m2.309s 00:06:49.197 sys 0m0.258s 00:06:49.197 07:31:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:49.197 07:31:59 -- common/autotest_common.sh@10 -- # set +x 00:06:49.197 ************************************ 00:06:49.197 END TEST accel_copy 00:06:49.197 ************************************ 00:06:49.197 07:31:59 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:49.197 07:31:59 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:49.197 07:31:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:49.197 07:31:59 -- common/autotest_common.sh@10 -- # set +x 00:06:49.197 ************************************ 00:06:49.197 START TEST accel_fill 00:06:49.197 ************************************ 00:06:49.197 07:31:59 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:49.197 07:31:59 -- accel/accel.sh@16 -- # local accel_opc 
00:06:49.197 07:31:59 -- accel/accel.sh@17 -- # local accel_module 00:06:49.197 07:31:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:49.197 07:31:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:49.197 07:31:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.197 07:31:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.197 07:31:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.197 07:31:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.197 07:31:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.197 07:31:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.197 07:31:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.197 07:31:59 -- accel/accel.sh@42 -- # jq -r . 00:06:49.197 [2024-11-28 07:31:59.657196] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:49.197 [2024-11-28 07:31:59.657304] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657566 ] 00:06:49.197 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.197 [2024-11-28 07:31:59.725746] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.197 [2024-11-28 07:31:59.761173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.573 07:32:00 -- accel/accel.sh@18 -- # out=' 00:06:50.573 SPDK Configuration: 00:06:50.573 Core mask: 0x1 00:06:50.573 00:06:50.573 Accel Perf Configuration: 00:06:50.573 Workload Type: fill 00:06:50.573 Fill pattern: 0x80 00:06:50.573 Transfer size: 4096 bytes 00:06:50.573 Vector count 1 00:06:50.573 Module: software 00:06:50.573 Queue depth: 64 00:06:50.573 Allocate depth: 64 00:06:50.573 # threads/core: 1 00:06:50.573 Run time: 1 seconds 00:06:50.573 Verify: Yes 00:06:50.573 00:06:50.573 Running for 1 seconds... 00:06:50.573 00:06:50.573 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.573 ------------------------------------------------------------------------------------ 00:06:50.573 0,0 943936/s 3687 MiB/s 0 0 00:06:50.573 ==================================================================================== 00:06:50.573 Total 943936/s 3687 MiB/s 0 0' 00:06:50.573 07:32:00 -- accel/accel.sh@20 -- # IFS=: 00:06:50.573 07:32:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.573 07:32:00 -- accel/accel.sh@20 -- # read -r var val 00:06:50.573 07:32:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.573 07:32:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.573 07:32:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.573 07:32:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.573 07:32:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.573 07:32:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.573 07:32:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.573 07:32:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.573 07:32:00 -- accel/accel.sh@42 -- # jq -r . 00:06:50.573 [2024-11-28 07:32:00.929022] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
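The fill case drives accel_perf with -f 128, so the engine writes byte 0x80 (decimal 128) across every 4096-byte destination buffer, and -q 64/-a 64 deepen the queue relative to the crc32c runs. What the verified result should look like can be reproduced outside accel_perf; an illustrative check of the 0x80 pattern (temporary file path is arbitrary):

    # Build a 4096-byte buffer of 0x80 (octal 200) and confirm it is uniform.
    head -c 4096 /dev/zero | tr '\0' '\200' > /tmp/fill_expected.bin
    od -An -v -tx1 /tmp/fill_expected.bin | tr -s ' ' '\n' | sort -u | grep .
    # prints only "80": every byte carries the -f fill value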
00:06:50.573 [2024-11-28 07:32:00.929076] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657832 ] 00:06:50.573 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.573 [2024-11-28 07:32:00.990447] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.573 [2024-11-28 07:32:01.024959] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.573 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.573 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.573 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.573 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.573 07:32:01 -- accel/accel.sh@21 -- # val=0x1 00:06:50.573 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.573 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.573 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.573 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.573 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=fill 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=0x80 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=software 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=64 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=64 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- 
accel/accel.sh@21 -- # val=1 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val=Yes 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 07:32:01 -- accel/accel.sh@21 -- # val= 00:06:50.574 07:32:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 07:32:01 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@21 -- # val= 00:06:51.508 07:32:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # IFS=: 00:06:51.508 07:32:02 -- accel/accel.sh@20 -- # read -r var val 00:06:51.508 07:32:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.508 07:32:02 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:51.508 07:32:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.508 00:06:51.508 real 0m2.548s 00:06:51.508 user 0m2.306s 00:06:51.508 sys 0m0.240s 00:06:51.508 07:32:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.508 07:32:02 -- common/autotest_common.sh@10 -- # set +x 00:06:51.508 ************************************ 00:06:51.508 END TEST accel_fill 00:06:51.508 ************************************ 00:06:51.508 07:32:02 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:51.508 07:32:02 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:51.508 07:32:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.508 07:32:02 -- common/autotest_common.sh@10 -- # set +x 00:06:51.508 ************************************ 00:06:51.508 START TEST 
accel_copy_crc32c 00:06:51.508 ************************************ 00:06:51.508 07:32:02 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:51.508 07:32:02 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.508 07:32:02 -- accel/accel.sh@17 -- # local accel_module 00:06:51.508 07:32:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:51.508 07:32:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:51.508 07:32:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.508 07:32:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.508 07:32:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.508 07:32:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.508 07:32:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.508 07:32:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.508 07:32:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.508 07:32:02 -- accel/accel.sh@42 -- # jq -r . 00:06:51.508 [2024-11-28 07:32:02.248159] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:51.508 [2024-11-28 07:32:02.248250] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658115 ] 00:06:51.767 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.767 [2024-11-28 07:32:02.315755] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.767 [2024-11-28 07:32:02.350237] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.143 07:32:03 -- accel/accel.sh@18 -- # out=' 00:06:53.143 SPDK Configuration: 00:06:53.143 Core mask: 0x1 00:06:53.143 00:06:53.143 Accel Perf Configuration: 00:06:53.143 Workload Type: copy_crc32c 00:06:53.143 CRC-32C seed: 0 00:06:53.143 Vector size: 4096 bytes 00:06:53.143 Transfer size: 4096 bytes 00:06:53.143 Vector count 1 00:06:53.143 Module: software 00:06:53.143 Queue depth: 32 00:06:53.143 Allocate depth: 32 00:06:53.143 # threads/core: 1 00:06:53.143 Run time: 1 seconds 00:06:53.143 Verify: Yes 00:06:53.143 00:06:53.143 Running for 1 seconds... 00:06:53.143 00:06:53.143 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.143 ------------------------------------------------------------------------------------ 00:06:53.143 0,0 421984/s 1648 MiB/s 0 0 00:06:53.143 ==================================================================================== 00:06:53.143 Total 421984/s 1648 MiB/s 0 0' 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:53.143 07:32:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.143 07:32:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.143 07:32:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.143 07:32:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.143 07:32:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.143 07:32:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.143 07:32:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.143 07:32:03 -- accel/accel.sh@42 -- # jq -r . 
00:06:53.143 [2024-11-28 07:32:03.517815] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:53.143 [2024-11-28 07:32:03.517869] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658387 ] 00:06:53.143 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.143 [2024-11-28 07:32:03.579421] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.143 [2024-11-28 07:32:03.614723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=0x1 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=0 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=software 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=32 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 
00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=32 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val=1 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.143 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.143 07:32:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:53.143 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.144 07:32:03 -- accel/accel.sh@21 -- # val=Yes 00:06:53.144 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.144 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.144 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:53.144 07:32:03 -- accel/accel.sh@21 -- # val= 00:06:53.144 07:32:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # IFS=: 00:06:53.144 07:32:03 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@21 -- # val= 00:06:54.078 07:32:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # IFS=: 00:06:54.078 07:32:04 -- accel/accel.sh@20 -- # read -r var val 00:06:54.078 07:32:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.078 07:32:04 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:54.078 07:32:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.078 00:06:54.078 real 0m2.551s 00:06:54.078 user 0m2.309s 00:06:54.078 sys 0m0.241s 00:06:54.078 07:32:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:54.078 07:32:04 -- common/autotest_common.sh@10 -- # set +x 00:06:54.078 ************************************ 00:06:54.078 END TEST accel_copy_crc32c 00:06:54.078 ************************************ 00:06:54.078 
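The next case repeats copy_crc32c with -C 2, so each operation carries two chained 4096-byte vectors; the configuration below accordingly reports Vector size: 4096 bytes and Transfer size: 8192 bytes, and bandwidth is accounted against the full 8192 bytes. Sanity-checking the table that follows:

    # With -C 2, each transfer covers 2 x 4096 = 8192 bytes:
    echo $(( 301792 * 8192 / 1048576 ))   # -> 2357 MiB/s, matching both table rows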
07:32:04 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:54.078 07:32:04 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:54.078 07:32:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.078 07:32:04 -- common/autotest_common.sh@10 -- # set +x 00:06:54.078 ************************************ 00:06:54.078 START TEST accel_copy_crc32c_C2 00:06:54.078 ************************************ 00:06:54.078 07:32:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:54.078 07:32:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.078 07:32:04 -- accel/accel.sh@17 -- # local accel_module 00:06:54.078 07:32:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:54.078 07:32:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:54.078 07:32:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.078 07:32:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.078 07:32:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.078 07:32:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.078 07:32:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.078 07:32:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.078 07:32:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.078 07:32:04 -- accel/accel.sh@42 -- # jq -r . 00:06:54.078 [2024-11-28 07:32:04.837217] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:54.078 [2024-11-28 07:32:04.837307] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658654 ] 00:06:54.337 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.337 [2024-11-28 07:32:04.907206] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.337 [2024-11-28 07:32:04.943621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.711 07:32:06 -- accel/accel.sh@18 -- # out=' 00:06:55.711 SPDK Configuration: 00:06:55.711 Core mask: 0x1 00:06:55.711 00:06:55.711 Accel Perf Configuration: 00:06:55.711 Workload Type: copy_crc32c 00:06:55.711 CRC-32C seed: 0 00:06:55.711 Vector size: 4096 bytes 00:06:55.711 Transfer size: 8192 bytes 00:06:55.711 Vector count 2 00:06:55.711 Module: software 00:06:55.711 Queue depth: 32 00:06:55.711 Allocate depth: 32 00:06:55.711 # threads/core: 1 00:06:55.711 Run time: 1 seconds 00:06:55.711 Verify: Yes 00:06:55.711 00:06:55.711 Running for 1 seconds... 
00:06:55.711 00:06:55.711 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.711 ------------------------------------------------------------------------------------ 00:06:55.711 0,0 301792/s 2357 MiB/s 0 0 00:06:55.711 ==================================================================================== 00:06:55.711 Total 301792/s 1178 MiB/s 0 0' 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:55.711 07:32:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:55.711 07:32:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.711 07:32:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.711 07:32:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.711 07:32:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.711 07:32:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.711 07:32:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.711 07:32:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.711 07:32:06 -- accel/accel.sh@42 -- # jq -r . 00:06:55.711 [2024-11-28 07:32:06.114099] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:55.711 [2024-11-28 07:32:06.114166] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658797 ] 00:06:55.711 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.711 [2024-11-28 07:32:06.177356] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.711 [2024-11-28 07:32:06.212222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=0x1 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=0 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # 
IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=software 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=32 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=32 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=1 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val=Yes 00:06:55.711 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.711 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.711 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.712 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.712 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.712 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:55.712 07:32:06 -- accel/accel.sh@21 -- # val= 00:06:55.712 07:32:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.712 07:32:06 -- accel/accel.sh@20 -- # IFS=: 00:06:55.712 07:32:06 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@21 -- # val= 00:06:56.648 07:32:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # IFS=: 00:06:56.648 07:32:07 -- accel/accel.sh@20 -- # read -r var val 00:06:56.648 07:32:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.648 07:32:07 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:56.648 07:32:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.648 00:06:56.648 real 0m2.556s 00:06:56.648 user 0m2.321s 00:06:56.648 sys 0m0.235s 00:06:56.648 07:32:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:56.648 07:32:07 -- common/autotest_common.sh@10 -- # set +x 00:06:56.648 ************************************ 00:06:56.648 END TEST accel_copy_crc32c_C2 00:06:56.648 ************************************ 00:06:56.648 07:32:07 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:56.648 07:32:07 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:56.648 07:32:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.648 07:32:07 -- common/autotest_common.sh@10 -- # set +x 00:06:56.648 ************************************ 00:06:56.648 START TEST accel_dualcast 00:06:56.648 ************************************ 00:06:56.648 07:32:07 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:56.648 07:32:07 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.648 07:32:07 -- accel/accel.sh@17 -- # local accel_module 00:06:56.648 07:32:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:56.648 07:32:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:56.648 07:32:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.648 07:32:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.648 07:32:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.648 07:32:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.648 07:32:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.648 07:32:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.907 07:32:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.907 07:32:07 -- accel/accel.sh@42 -- # jq -r . 00:06:56.907 [2024-11-28 07:32:07.432710] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
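Each TEST block in this stretch has the same shape: run_test wraps accel_test, and accel_test launches the accel_perf example binary with the workload under test, apparently feeding it the JSON accel config that build_accel_config assembles via -c /dev/fd/62 (accel.sh@12 in the trace). Stripped of that harness plumbing, the recorded invocation reduces to something like the following hand-run sketch, assuming a built SPDK tree; this is not anything the harness itself prints:

  # 1-second software dualcast benchmark with verification, as in the trace above
  ./build/examples/accel_perf -t 1 -w dualcast -y

The -t 1 -w dualcast -y flags are exactly the ones recorded at accel.sh@12; only the config-over-file-descriptor plumbing is dropped.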
00:06:56.907 [2024-11-28 07:32:07.432797] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658975 ] 00:06:56.907 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.907 [2024-11-28 07:32:07.500644] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.907 [2024-11-28 07:32:07.535934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.314 07:32:08 -- accel/accel.sh@18 -- # out=' 00:06:58.314 SPDK Configuration: 00:06:58.314 Core mask: 0x1 00:06:58.314 00:06:58.314 Accel Perf Configuration: 00:06:58.314 Workload Type: dualcast 00:06:58.314 Transfer size: 4096 bytes 00:06:58.314 Vector count 1 00:06:58.314 Module: software 00:06:58.314 Queue depth: 32 00:06:58.314 Allocate depth: 32 00:06:58.314 # threads/core: 1 00:06:58.314 Run time: 1 seconds 00:06:58.314 Verify: Yes 00:06:58.314 00:06:58.314 Running for 1 seconds... 00:06:58.314 00:06:58.314 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.314 ------------------------------------------------------------------------------------ 00:06:58.314 0,0 628544/s 2455 MiB/s 0 0 00:06:58.314 ==================================================================================== 00:06:58.314 Total 628544/s 2455 MiB/s 0 0' 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:58.314 07:32:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:58.314 07:32:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.314 07:32:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.314 07:32:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.314 07:32:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.314 07:32:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.314 07:32:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.314 07:32:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.314 07:32:08 -- accel/accel.sh@42 -- # jq -r . 00:06:58.314 [2024-11-28 07:32:08.705297] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
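The MiB/s columns in these result tables can be sanity-checked by hand, since bandwidth is just transfers/s multiplied by the transfer size each run reports. The figures below are copied from the two tables above; the awk call is illustrative arithmetic, not harness output:

  awk 'BEGIN {
      # copy_crc32c -C 2: 8192-byte transfers (4096-byte vectors x 2)
      printf "copy_crc32c -C 2: %d MiB/s\n", 301792 * 8192 / 1048576   # -> 2357
      # dualcast: 4096-byte transfers
      printf "dualcast:         %d MiB/s\n", 628544 * 4096 / 1048576   # -> 2455
  }'

Both match the per-core rows. The copy_crc32c Total row's 1178 MiB/s is the same transfer rate counted against the 4096-byte vector size instead of the full 8192-byte transfer.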
00:06:58.314 [2024-11-28 07:32:08.705359] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659002 ] 00:06:58.314 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.314 [2024-11-28 07:32:08.769505] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.314 [2024-11-28 07:32:08.803787] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=0x1 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=dualcast 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=software 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=32 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=32 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=1 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val=Yes 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:58.314 07:32:08 -- accel/accel.sh@21 -- # val= 00:06:58.314 07:32:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # IFS=: 00:06:58.314 07:32:08 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@21 -- # val= 00:06:59.252 07:32:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # IFS=: 00:06:59.252 07:32:09 -- accel/accel.sh@20 -- # read -r var val 00:06:59.252 07:32:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:59.252 07:32:09 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:59.252 07:32:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.252 00:06:59.252 real 0m2.552s 00:06:59.252 user 0m2.309s 00:06:59.252 sys 0m0.243s 00:06:59.252 07:32:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:59.252 07:32:09 -- common/autotest_common.sh@10 -- # set +x 00:06:59.252 ************************************ 00:06:59.252 END TEST accel_dualcast 00:06:59.252 ************************************ 00:06:59.252 07:32:10 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:59.252 07:32:10 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:59.252 07:32:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.252 07:32:10 -- common/autotest_common.sh@10 -- # set +x 00:06:59.252 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:06:59.252 ************************************ 00:06:59.252 START TEST accel_compare 00:06:59.252 ************************************ 00:06:59.252 07:32:10 -- 
common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:59.252 07:32:10 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.252 07:32:10 -- accel/accel.sh@17 -- # local accel_module 00:06:59.252 07:32:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:59.252 07:32:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:59.252 07:32:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.252 07:32:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.252 07:32:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.252 07:32:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.252 07:32:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.252 07:32:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.252 07:32:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.252 07:32:10 -- accel/accel.sh@42 -- # jq -r . 00:06:59.510 [2024-11-28 07:32:10.030010] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:59.510 [2024-11-28 07:32:10.030091] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659036 ] 00:06:59.510 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.510 [2024-11-28 07:32:10.099964] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.510 [2024-11-28 07:32:10.136466] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.886 07:32:11 -- accel/accel.sh@18 -- # out=' 00:07:00.886 SPDK Configuration: 00:07:00.886 Core mask: 0x1 00:07:00.886 00:07:00.886 Accel Perf Configuration: 00:07:00.886 Workload Type: compare 00:07:00.886 Transfer size: 4096 bytes 00:07:00.886 Vector count 1 00:07:00.886 Module: software 00:07:00.886 Queue depth: 32 00:07:00.886 Allocate depth: 32 00:07:00.886 # threads/core: 1 00:07:00.886 Run time: 1 seconds 00:07:00.886 Verify: Yes 00:07:00.886 00:07:00.886 Running for 1 seconds... 00:07:00.886 00:07:00.886 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.886 ------------------------------------------------------------------------------------ 00:07:00.886 0,0 813312/s 3177 MiB/s 0 0 00:07:00.886 ==================================================================================== 00:07:00.886 Total 813312/s 3177 MiB/s 0 0' 00:07:00.886 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.886 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.886 07:32:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:00.886 07:32:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:00.886 07:32:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.886 07:32:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.886 07:32:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.887 07:32:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.887 07:32:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.887 07:32:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.887 07:32:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.887 07:32:11 -- accel/accel.sh@42 -- # jq -r . 00:07:00.887 [2024-11-28 07:32:11.306453] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
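The walls of accel.sh@20-22 trace lines that dominate each block are the harness parsing accel_perf output: judging by the trace, the second accel_perf launch (accel.sh@15) has its stdout consumed line by line with a colon-separated read, each value is trimmed (the @21 val= lines), and a case on the key picks out the module (@23) and the workload opcode (@24), which the @28 checks then assert. Reconstructed from those trace lines, the loop is roughly the sketch below, with a two-line sample standing in for the accel_perf output; this is inferred, not the verbatim accel.sh:

  while IFS=: read -r var val; do
      val=${val# }                                # trim the space after ':' (the @21 val= lines)
      case "$var" in
          *Module*) accel_module=$val ;;          # -> software
          *'Workload Type'*) accel_opc=$val ;;    # -> compare
      esac
  done <<< $'Module: software\nWorkload Type: compare'
  echo "$accel_module $accel_opc"

Every output line that matches neither key falls straight through the case, which is why the IFS=:/read/case triplet repeats once per field of the configuration echo.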
00:07:00.887 [2024-11-28 07:32:11.306512] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659057 ] 00:07:00.887 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.887 [2024-11-28 07:32:11.370174] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.887 [2024-11-28 07:32:11.404665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=0x1 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=compare 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@24 -- # accel_opc=compare 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=software 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=32 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=32 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=1 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val=Yes 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:00.887 07:32:11 -- accel/accel.sh@21 -- # val= 00:07:00.887 07:32:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # IFS=: 00:07:00.887 07:32:11 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@21 -- # val= 00:07:01.823 07:32:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # IFS=: 00:07:01.823 07:32:12 -- accel/accel.sh@20 -- # read -r var val 00:07:01.823 07:32:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.823 07:32:12 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:07:01.823 07:32:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.823 00:07:01.823 real 0m2.556s 00:07:01.823 user 0m2.314s 00:07:01.823 sys 0m0.241s 00:07:01.823 07:32:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:01.823 07:32:12 -- common/autotest_common.sh@10 -- # set +x 00:07:01.823 ************************************ 00:07:01.823 END TEST accel_compare 00:07:01.823 ************************************ 00:07:02.082 07:32:12 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:02.082 07:32:12 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:02.082 07:32:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.082 07:32:12 -- common/autotest_common.sh@10 -- # set +x 00:07:02.082 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:02.082 ************************************ 00:07:02.082 START TEST accel_xor 00:07:02.082 ************************************ 00:07:02.082 07:32:12 -- 
common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:07:02.082 07:32:12 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.082 07:32:12 -- accel/accel.sh@17 -- # local accel_module 00:07:02.082 07:32:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:07:02.082 07:32:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:02.082 07:32:12 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.082 07:32:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.082 07:32:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.082 07:32:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.082 07:32:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.082 07:32:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.082 07:32:12 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.082 07:32:12 -- accel/accel.sh@42 -- # jq -r . 00:07:02.082 [2024-11-28 07:32:12.625138] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:02.082 [2024-11-28 07:32:12.625228] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659091 ] 00:07:02.082 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.082 [2024-11-28 07:32:12.693854] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.082 [2024-11-28 07:32:12.729647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.463 07:32:13 -- accel/accel.sh@18 -- # out=' 00:07:03.463 SPDK Configuration: 00:07:03.463 Core mask: 0x1 00:07:03.463 00:07:03.463 Accel Perf Configuration: 00:07:03.463 Workload Type: xor 00:07:03.463 Source buffers: 2 00:07:03.463 Transfer size: 4096 bytes 00:07:03.463 Vector count 1 00:07:03.463 Module: software 00:07:03.463 Queue depth: 32 00:07:03.463 Allocate depth: 32 00:07:03.463 # threads/core: 1 00:07:03.463 Run time: 1 seconds 00:07:03.463 Verify: Yes 00:07:03.463 00:07:03.463 Running for 1 seconds... 00:07:03.463 00:07:03.463 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.463 ------------------------------------------------------------------------------------ 00:07:03.463 0,0 709216/s 2770 MiB/s 0 0 00:07:03.463 ==================================================================================== 00:07:03.463 Total 709216/s 2770 MiB/s 0 0' 00:07:03.463 07:32:13 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:13 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:03.463 07:32:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:03.463 07:32:13 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.463 07:32:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.463 07:32:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.463 07:32:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.463 07:32:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.463 07:32:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.463 07:32:13 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.463 07:32:13 -- accel/accel.sh@42 -- # jq -r . 00:07:03.463 [2024-11-28 07:32:13.908721] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
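This first xor pass runs with two source buffers (Source buffers: 2 in the configuration echo); the follow-up accel_xor test rerun with -x 3 lifts that to three, and its table further down shows the cost: 709216 transfers/s drop to 664288, roughly six percent. The bandwidth figure checks out the same way as before (illustrative arithmetic, not harness output):

  awk 'BEGIN { printf "%d MiB/s\n", 709216 * 4096 / 1048576 }'   # -> 2770, matching the table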
00:07:03.463 [2024-11-28 07:32:13.908813] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659120 ] 00:07:03.463 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.463 [2024-11-28 07:32:13.977271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.463 [2024-11-28 07:32:14.011757] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=0x1 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=xor 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=2 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=software 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=32 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=32 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- 
accel/accel.sh@21 -- # val=1 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val=Yes 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:03.463 07:32:14 -- accel/accel.sh@21 -- # val= 00:07:03.463 07:32:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # IFS=: 00:07:03.463 07:32:14 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@21 -- # val= 00:07:04.843 07:32:15 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # IFS=: 00:07:04.843 07:32:15 -- accel/accel.sh@20 -- # read -r var val 00:07:04.843 07:32:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.843 07:32:15 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:04.843 07:32:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.843 00:07:04.843 real 0m2.569s 00:07:04.843 user 0m2.317s 00:07:04.843 sys 0m0.251s 00:07:04.843 07:32:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.843 07:32:15 -- common/autotest_common.sh@10 -- # set +x 00:07:04.843 ************************************ 00:07:04.843 END TEST accel_xor 00:07:04.843 ************************************ 00:07:04.843 07:32:15 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:04.843 07:32:15 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:04.843 07:32:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.843 07:32:15 -- common/autotest_common.sh@10 -- # set +x 00:07:04.843 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:04.843 ************************************ 00:07:04.843 START TEST accel_xor 00:07:04.843 ************************************ 00:07:04.843 07:32:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:07:04.843 07:32:15 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.843 07:32:15 -- accel/accel.sh@17 -- # local accel_module 00:07:04.843 07:32:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:04.843 07:32:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:04.843 07:32:15 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.843 07:32:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.843 07:32:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.843 07:32:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.843 07:32:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.843 07:32:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.843 07:32:15 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.843 07:32:15 -- accel/accel.sh@42 -- # jq -r . 00:07:04.843 [2024-11-28 07:32:15.235247] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:04.843 [2024-11-28 07:32:15.235339] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659157 ] 00:07:04.843 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.843 [2024-11-28 07:32:15.303540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.843 [2024-11-28 07:32:15.338719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.848 07:32:16 -- accel/accel.sh@18 -- # out=' 00:07:05.848 SPDK Configuration: 00:07:05.848 Core mask: 0x1 00:07:05.848 00:07:05.848 Accel Perf Configuration: 00:07:05.848 Workload Type: xor 00:07:05.848 Source buffers: 3 00:07:05.848 Transfer size: 4096 bytes 00:07:05.848 Vector count 1 00:07:05.848 Module: software 00:07:05.848 Queue depth: 32 00:07:05.848 Allocate depth: 32 00:07:05.848 # threads/core: 1 00:07:05.848 Run time: 1 seconds 00:07:05.848 Verify: Yes 00:07:05.848 00:07:05.848 Running for 1 seconds... 
00:07:05.848 00:07:05.848 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.848 ------------------------------------------------------------------------------------ 00:07:05.848 0,0 664288/s 2594 MiB/s 0 0 00:07:05.848 ==================================================================================== 00:07:05.848 Total 664288/s 2594 MiB/s 0 0' 00:07:05.848 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:05.848 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:05.848 07:32:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:05.848 07:32:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:05.848 07:32:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.848 07:32:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.848 07:32:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.848 07:32:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.848 07:32:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.848 07:32:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.848 07:32:16 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.848 07:32:16 -- accel/accel.sh@42 -- # jq -r . 00:07:05.848 [2024-11-28 07:32:16.517476] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:05.848 [2024-11-28 07:32:16.517568] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659176 ] 00:07:05.848 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.120 [2024-11-28 07:32:16.586759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.120 [2024-11-28 07:32:16.622092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=0x1 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=xor 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=3 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- 
accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=software 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@23 -- # accel_module=software 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=32 00:07:06.120 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.120 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.120 07:32:16 -- accel/accel.sh@21 -- # val=32 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.121 07:32:16 -- accel/accel.sh@21 -- # val=1 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.121 07:32:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.121 07:32:16 -- accel/accel.sh@21 -- # val=Yes 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.121 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:06.121 07:32:16 -- accel/accel.sh@21 -- # val= 00:07:06.121 07:32:16 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # IFS=: 00:07:06.121 07:32:16 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 
07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@21 -- # val= 00:07:07.105 07:32:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # IFS=: 00:07:07.105 07:32:17 -- accel/accel.sh@20 -- # read -r var val 00:07:07.105 07:32:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.105 07:32:17 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:07.105 07:32:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.105 00:07:07.105 real 0m2.568s 00:07:07.105 user 0m2.309s 00:07:07.105 sys 0m0.259s 00:07:07.105 07:32:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:07.105 07:32:17 -- common/autotest_common.sh@10 -- # set +x 00:07:07.105 ************************************ 00:07:07.105 END TEST accel_xor 00:07:07.105 ************************************ 00:07:07.105 07:32:17 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:07.105 07:32:17 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:07.105 07:32:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.105 07:32:17 -- common/autotest_common.sh@10 -- # set +x 00:07:07.105 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:07.105 ************************************ 00:07:07.105 START TEST accel_dif_verify 00:07:07.105 ************************************ 00:07:07.105 07:32:17 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:07:07.105 07:32:17 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.105 07:32:17 -- accel/accel.sh@17 -- # local accel_module 00:07:07.105 07:32:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:07.105 07:32:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:07.105 07:32:17 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.105 07:32:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.105 07:32:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.105 07:32:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.105 07:32:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.105 07:32:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.105 07:32:17 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.105 07:32:17 -- accel/accel.sh@42 -- # jq -r . 00:07:07.105 [2024-11-28 07:32:17.845413] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:07.105 [2024-11-28 07:32:17.845507] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659218 ] 00:07:07.364 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.364 [2024-11-28 07:32:17.914527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.364 [2024-11-28 07:32:17.950341] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.740 07:32:19 -- accel/accel.sh@18 -- # out=' 00:07:08.740 SPDK Configuration: 00:07:08.740 Core mask: 0x1 00:07:08.740 00:07:08.740 Accel Perf Configuration: 00:07:08.740 Workload Type: dif_verify 00:07:08.740 Vector size: 4096 bytes 00:07:08.740 Transfer size: 4096 bytes 00:07:08.740 Block size: 512 bytes 00:07:08.740 Metadata size: 8 bytes 00:07:08.740 Vector count 1 00:07:08.740 Module: software 00:07:08.740 Queue depth: 32 00:07:08.740 Allocate depth: 32 00:07:08.740 # threads/core: 1 00:07:08.740 Run time: 1 seconds 00:07:08.740 Verify: No 00:07:08.740 00:07:08.740 Running for 1 seconds... 00:07:08.740 00:07:08.740 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.740 ------------------------------------------------------------------------------------ 00:07:08.740 0,0 248672/s 986 MiB/s 0 0 00:07:08.740 ==================================================================================== 00:07:08.740 Total 248672/s 971 MiB/s 0 0' 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:08.740 07:32:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:08.740 07:32:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.740 07:32:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.740 07:32:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.740 07:32:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.740 07:32:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.740 07:32:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.740 07:32:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.740 07:32:19 -- accel/accel.sh@42 -- # jq -r . 00:07:08.740 [2024-11-28 07:32:19.128209] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
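dif_verify is the one table in this stretch where the per-core row and the Total row disagree at the same transfer rate: 986 versus 971 MiB/s at 248672 transfers/s. The configuration echo explains the gap arithmetically: with 512-byte blocks carrying 8 bytes of metadata each, a 4096-byte transfer spans eight blocks plus another 64 bytes of DIF metadata, and 4160 bytes per transfer reproduces the larger figure while the bare 4096 reproduces the smaller. Which row counts the metadata is a reading of the numbers, not anything the tool documents here:

  awk 'BEGIN {
      printf "data + DIF metadata: %d MiB/s\n", 248672 * (4096 + 8 * 8) / 1048576   # -> 986
      printf "data only:           %d MiB/s\n", 248672 * 4096 / 1048576             # -> 971
  }'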
00:07:08.740 [2024-11-28 07:32:19.128302] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659243 ] 00:07:08.740 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.740 [2024-11-28 07:32:19.196881] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.740 [2024-11-28 07:32:19.231102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=0x1 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=dif_verify 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=software 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r 
var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=32 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=32 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.740 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.740 07:32:19 -- accel/accel.sh@21 -- # val=1 00:07:08.740 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.741 07:32:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.741 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.741 07:32:19 -- accel/accel.sh@21 -- # val=No 00:07:08.741 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.741 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.741 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:08.741 07:32:19 -- accel/accel.sh@21 -- # val= 00:07:08.741 07:32:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # IFS=: 00:07:08.741 07:32:19 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@21 -- # val= 00:07:09.676 07:32:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # IFS=: 00:07:09.676 07:32:20 -- accel/accel.sh@20 -- # read -r var val 00:07:09.676 07:32:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:09.676 07:32:20 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:09.676 07:32:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.676 00:07:09.676 real 0m2.567s 00:07:09.676 user 0m2.313s 00:07:09.676 sys 0m0.254s 00:07:09.676 07:32:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:09.676 07:32:20 -- common/autotest_common.sh@10 -- # set +x 00:07:09.676 
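The dif_verify pass that finishes above is driven by test/accel/accel.sh, which wraps the accel_perf binary shown in the xtrace output. A minimal sketch for replaying it by hand outside Jenkins, assuming a built SPDK tree (substitute your own checkout for this job's workspace path; the -c /dev/fd/62 JSON config is empty here, per accel_json_cfg=() in the trace, so it can be dropped):

    # replay the logged dif_verify run: a 1-second software pass, defaults otherwise
    ./build/examples/accel_perf -t 1 -w dif_verify

Only -t and -w appear on the recorded command line; the queue depth of 32, the 4096-byte transfers, and the 512-byte block / 8-byte metadata DIF layout reported in each 'Accel Perf Configuration' block are accel_perf defaults in this log.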
************************************ 00:07:09.676 END TEST accel_dif_verify 00:07:09.676 ************************************ 00:07:09.676 07:32:20 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:09.676 07:32:20 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:09.676 07:32:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:09.676 07:32:20 -- common/autotest_common.sh@10 -- # set +x 00:07:09.676 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:09.676 ************************************ 00:07:09.676 START TEST accel_dif_generate 00:07:09.676 ************************************ 00:07:09.676 07:32:20 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:09.676 07:32:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.676 07:32:20 -- accel/accel.sh@17 -- # local accel_module 00:07:09.676 07:32:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:09.676 07:32:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:09.676 07:32:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.676 07:32:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.677 07:32:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.677 07:32:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.677 07:32:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.677 07:32:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.677 07:32:20 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.677 07:32:20 -- accel/accel.sh@42 -- # jq -r . 00:07:09.935 [2024-11-28 07:32:20.446561] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:09.935 [2024-11-28 07:32:20.446660] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659279 ] 00:07:09.935 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.935 [2024-11-28 07:32:20.514201] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.935 [2024-11-28 07:32:20.549353] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.310 07:32:21 -- accel/accel.sh@18 -- # out=' 00:07:11.310 SPDK Configuration: 00:07:11.310 Core mask: 0x1 00:07:11.310 00:07:11.310 Accel Perf Configuration: 00:07:11.310 Workload Type: dif_generate 00:07:11.310 Vector size: 4096 bytes 00:07:11.310 Transfer size: 4096 bytes 00:07:11.310 Block size: 512 bytes 00:07:11.310 Metadata size: 8 bytes 00:07:11.310 Vector count 1 00:07:11.310 Module: software 00:07:11.310 Queue depth: 32 00:07:11.310 Allocate depth: 32 00:07:11.311 # threads/core: 1 00:07:11.311 Run time: 1 seconds 00:07:11.311 Verify: No 00:07:11.311 00:07:11.311 Running for 1 seconds... 
00:07:11.311 00:07:11.311 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.311 ------------------------------------------------------------------------------------ 00:07:11.311 0,0 283712/s 1108 MiB/s 0 0 00:07:11.311 ==================================================================================== 00:07:11.311 Total 283712/s 1108 MiB/s 0 0' 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:11.311 07:32:21 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.311 07:32:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.311 07:32:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.311 07:32:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.311 07:32:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.311 07:32:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.311 07:32:21 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.311 07:32:21 -- accel/accel.sh@42 -- # jq -r . 00:07:11.311 [2024-11-28 07:32:21.727829] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:11.311 [2024-11-28 07:32:21.727922] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659298 ] 00:07:11.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.311 [2024-11-28 07:32:21.795859] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.311 [2024-11-28 07:32:21.830106] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=0x1 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=dif_generate 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 
00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=software 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@23 -- # accel_module=software 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=32 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=32 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=1 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val=No 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:11.311 07:32:21 -- accel/accel.sh@21 -- # val= 00:07:11.311 07:32:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # IFS=: 00:07:11.311 07:32:21 -- accel/accel.sh@20 -- # read -r var val 00:07:12.244 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.244 07:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.245 07:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.245 07:32:22 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.245 07:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.245 07:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@21 -- # val= 00:07:12.245 07:32:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # IFS=: 00:07:12.245 07:32:22 -- accel/accel.sh@20 -- # read -r var val 00:07:12.245 07:32:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.245 07:32:22 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:12.245 07:32:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.245 00:07:12.245 real 0m2.564s 00:07:12.245 user 0m2.302s 00:07:12.245 sys 0m0.262s 00:07:12.245 07:32:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:12.245 07:32:22 -- common/autotest_common.sh@10 -- # set +x 00:07:12.245 ************************************ 00:07:12.245 END TEST accel_dif_generate 00:07:12.245 ************************************ 00:07:12.503 07:32:23 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:12.503 07:32:23 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:12.503 07:32:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.503 07:32:23 -- common/autotest_common.sh@10 -- # set +x 00:07:12.503 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:12.503 ************************************ 00:07:12.503 START TEST accel_dif_generate_copy 00:07:12.503 ************************************ 00:07:12.503 07:32:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:12.503 07:32:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.503 07:32:23 -- accel/accel.sh@17 -- # local accel_module 00:07:12.503 07:32:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:12.503 07:32:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:12.503 07:32:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.503 07:32:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.503 07:32:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.503 07:32:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.503 07:32:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.503 07:32:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.503 07:32:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.503 07:32:23 -- accel/accel.sh@42 -- # jq -r . 00:07:12.503 [2024-11-28 07:32:23.041926] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:12.503 [2024-11-28 07:32:23.041985] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659336 ] 00:07:12.503 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.503 [2024-11-28 07:32:23.101226] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.503 [2024-11-28 07:32:23.136563] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.881 07:32:24 -- accel/accel.sh@18 -- # out=' 00:07:13.881 SPDK Configuration: 00:07:13.881 Core mask: 0x1 00:07:13.881 00:07:13.881 Accel Perf Configuration: 00:07:13.881 Workload Type: dif_generate_copy 00:07:13.881 Vector size: 4096 bytes 00:07:13.881 Transfer size: 4096 bytes 00:07:13.881 Vector count 1 00:07:13.881 Module: software 00:07:13.881 Queue depth: 32 00:07:13.881 Allocate depth: 32 00:07:13.881 # threads/core: 1 00:07:13.881 Run time: 1 seconds 00:07:13.881 Verify: No 00:07:13.881 00:07:13.881 Running for 1 seconds... 00:07:13.881 00:07:13.881 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.881 ------------------------------------------------------------------------------------ 00:07:13.881 0,0 229248/s 895 MiB/s 0 0 00:07:13.881 ==================================================================================== 00:07:13.881 Total 229248/s 895 MiB/s 0 0' 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:13.881 07:32:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.881 07:32:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.881 07:32:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.881 07:32:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.881 07:32:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.881 07:32:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.881 07:32:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.881 07:32:24 -- accel/accel.sh@42 -- # jq -r . 00:07:13.881 [2024-11-28 07:32:24.303325] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:13.881 [2024-11-28 07:32:24.303381] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659364 ] 00:07:13.881 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.881 [2024-11-28 07:32:24.364228] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.881 [2024-11-28 07:32:24.398751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=0x1 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=software 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=32 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=32 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r 
var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=1 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val=No 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:13.881 07:32:24 -- accel/accel.sh@21 -- # val= 00:07:13.881 07:32:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # IFS=: 00:07:13.881 07:32:24 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@21 -- # val= 00:07:14.816 07:32:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # IFS=: 00:07:14.816 07:32:25 -- accel/accel.sh@20 -- # read -r var val 00:07:14.816 07:32:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.816 07:32:25 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:14.816 07:32:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.816 00:07:14.816 real 0m2.529s 00:07:14.816 user 0m2.295s 00:07:14.816 sys 0m0.233s 00:07:14.816 07:32:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:14.816 07:32:25 -- common/autotest_common.sh@10 -- # set +x 00:07:14.816 ************************************ 00:07:14.816 END TEST accel_dif_generate_copy 00:07:14.816 ************************************ 00:07:15.075 07:32:25 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:15.075 07:32:25 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.075 07:32:25 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:15.075 07:32:25 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.075 07:32:25 -- common/autotest_common.sh@10 -- # set +x 00:07:15.075 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:15.075 ************************************ 00:07:15.075 START TEST accel_comp 00:07:15.075 ************************************ 00:07:15.075 07:32:25 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.075 07:32:25 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.075 07:32:25 -- accel/accel.sh@17 -- # local accel_module 00:07:15.075 07:32:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.075 07:32:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.075 07:32:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.075 07:32:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.075 07:32:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.075 07:32:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.075 07:32:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.075 07:32:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.075 07:32:25 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.075 07:32:25 -- accel/accel.sh@42 -- # jq -r . 00:07:15.075 [2024-11-28 07:32:25.616730] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:15.075 [2024-11-28 07:32:25.616820] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659398 ] 00:07:15.075 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.075 [2024-11-28 07:32:25.683445] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.075 [2024-11-28 07:32:25.719133] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.450 07:32:26 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.451 00:07:16.451 SPDK Configuration: 00:07:16.451 Core mask: 0x1 00:07:16.451 00:07:16.451 Accel Perf Configuration: 00:07:16.451 Workload Type: compress 00:07:16.451 Transfer size: 4096 bytes 00:07:16.451 Vector count 1 00:07:16.451 Module: software 00:07:16.451 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.451 Queue depth: 32 00:07:16.451 Allocate depth: 32 00:07:16.451 # threads/core: 1 00:07:16.451 Run time: 1 seconds 00:07:16.451 Verify: No 00:07:16.451 00:07:16.451 Running for 1 seconds... 
00:07:16.451 00:07:16.451 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.451 ------------------------------------------------------------------------------------ 00:07:16.451 0,0 68320/s 266 MiB/s 0 0 00:07:16.451 ==================================================================================== 00:07:16.451 Total 68320/s 266 MiB/s 0 0' 00:07:16.451 07:32:26 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.451 07:32:26 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.451 07:32:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.451 07:32:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.451 07:32:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.451 07:32:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.451 07:32:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.451 07:32:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.451 07:32:26 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.451 07:32:26 -- accel/accel.sh@42 -- # jq -r . 00:07:16.451 [2024-11-28 07:32:26.897828] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:16.451 [2024-11-28 07:32:26.897921] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659419 ] 00:07:16.451 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.451 [2024-11-28 07:32:26.965217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.451 [2024-11-28 07:32:26.999379] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=0x1 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=compress 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 
07:32:27 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=software 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=32 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=32 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=1 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val=No 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:16.451 07:32:27 -- accel/accel.sh@21 -- # val= 00:07:16.451 07:32:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # IFS=: 00:07:16.451 07:32:27 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # 
IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@21 -- # val= 00:07:17.827 07:32:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # IFS=: 00:07:17.827 07:32:28 -- accel/accel.sh@20 -- # read -r var val 00:07:17.827 07:32:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.827 07:32:28 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:17.827 07:32:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.827 00:07:17.827 real 0m2.564s 00:07:17.827 user 0m2.298s 00:07:17.827 sys 0m0.266s 00:07:17.827 07:32:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:17.827 07:32:28 -- common/autotest_common.sh@10 -- # set +x 00:07:17.827 ************************************ 00:07:17.827 END TEST accel_comp 00:07:17.827 ************************************ 00:07:17.827 07:32:28 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.827 07:32:28 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:17.827 07:32:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.827 07:32:28 -- common/autotest_common.sh@10 -- # set +x 00:07:17.827 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:17.827 ************************************ 00:07:17.827 START TEST accel_decomp 00:07:17.827 ************************************ 00:07:17.827 07:32:28 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.827 07:32:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.827 07:32:28 -- accel/accel.sh@17 -- # local accel_module 00:07:17.827 07:32:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.827 07:32:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:17.827 07:32:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.827 07:32:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.827 07:32:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.827 07:32:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.827 07:32:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.827 07:32:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.827 07:32:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.827 07:32:28 -- accel/accel.sh@42 -- # jq -r . 00:07:17.827 [2024-11-28 07:32:28.218108] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:17.827 [2024-11-28 07:32:28.218220] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659454 ] 00:07:17.827 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.827 [2024-11-28 07:32:28.285489] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.827 [2024-11-28 07:32:28.320834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.762 07:32:29 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:18.762 00:07:18.762 SPDK Configuration: 00:07:18.762 Core mask: 0x1 00:07:18.762 00:07:18.762 Accel Perf Configuration: 00:07:18.762 Workload Type: decompress 00:07:18.762 Transfer size: 4096 bytes 00:07:18.763 Vector count 1 00:07:18.763 Module: software 00:07:18.763 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.763 Queue depth: 32 00:07:18.763 Allocate depth: 32 00:07:18.763 # threads/core: 1 00:07:18.763 Run time: 1 seconds 00:07:18.763 Verify: Yes 00:07:18.763 00:07:18.763 Running for 1 seconds... 00:07:18.763 00:07:18.763 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:18.763 ------------------------------------------------------------------------------------ 00:07:18.763 0,0 94656/s 369 MiB/s 0 0 00:07:18.763 ==================================================================================== 00:07:18.763 Total 94656/s 369 MiB/s 0 0' 00:07:18.763 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:18.763 07:32:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:18.763 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:18.763 07:32:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:18.763 07:32:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.763 07:32:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.763 07:32:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.763 07:32:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.763 07:32:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.763 07:32:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.763 07:32:29 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.763 07:32:29 -- accel/accel.sh@42 -- # jq -r . 00:07:18.763 [2024-11-28 07:32:29.489769] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:18.763 [2024-11-28 07:32:29.489824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659481 ] 00:07:18.763 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.022 [2024-11-28 07:32:29.550730] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.022 [2024-11-28 07:32:29.585002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=0x1 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=decompress 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=software 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=32 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 
07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=32 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=1 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val=Yes 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:19.022 07:32:29 -- accel/accel.sh@21 -- # val= 00:07:19.022 07:32:29 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # IFS=: 00:07:19.022 07:32:29 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@21 -- # val= 00:07:20.398 07:32:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # IFS=: 00:07:20.398 07:32:30 -- accel/accel.sh@20 -- # read -r var val 00:07:20.398 07:32:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.398 07:32:30 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:20.398 07:32:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.398 00:07:20.398 real 0m2.550s 00:07:20.398 user 0m2.299s 00:07:20.398 sys 0m0.250s 00:07:20.399 07:32:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:20.399 07:32:30 -- common/autotest_common.sh@10 -- # set +x 00:07:20.399 ************************************ 00:07:20.399 END TEST accel_decomp 00:07:20.399 ************************************ 00:07:20.399 07:32:30 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.399 07:32:30 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:20.399 07:32:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.399 07:32:30 -- common/autotest_common.sh@10 -- # set +x 00:07:20.399 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:20.399 ************************************ 00:07:20.399 START TEST accel_decmop_full 00:07:20.399 ************************************ 00:07:20.399 07:32:30 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.399 07:32:30 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.399 07:32:30 -- accel/accel.sh@17 -- # local accel_module 00:07:20.399 07:32:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.399 07:32:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.399 07:32:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:20.399 07:32:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.399 07:32:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.399 07:32:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.399 07:32:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.399 07:32:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.399 07:32:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.399 07:32:30 -- accel/accel.sh@42 -- # jq -r . 00:07:20.399 [2024-11-28 07:32:30.804827] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:20.399 [2024-11-28 07:32:30.804918] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659515 ] 00:07:20.399 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.399 [2024-11-28 07:32:30.871713] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.399 [2024-11-28 07:32:30.907115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.334 07:32:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:21.334 00:07:21.334 SPDK Configuration: 00:07:21.334 Core mask: 0x1 00:07:21.334 00:07:21.334 Accel Perf Configuration: 00:07:21.334 Workload Type: decompress 00:07:21.334 Transfer size: 111250 bytes 00:07:21.334 Vector count 1 00:07:21.334 Module: software 00:07:21.334 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.334 Queue depth: 32 00:07:21.334 Allocate depth: 32 00:07:21.334 # threads/core: 1 00:07:21.334 Run time: 1 seconds 00:07:21.334 Verify: Yes 00:07:21.334 00:07:21.334 Running for 1 seconds... 
00:07:21.334 00:07:21.334 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:21.334 ------------------------------------------------------------------------------------ 00:07:21.334 0,0 5952/s 631 MiB/s 0 0 00:07:21.334 ==================================================================================== 00:07:21.334 Total 5952/s 631 MiB/s 0 0' 00:07:21.334 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.334 07:32:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.334 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.334 07:32:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.334 07:32:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.334 07:32:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.334 07:32:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.334 07:32:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.334 07:32:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.334 07:32:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.334 07:32:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.334 07:32:32 -- accel/accel.sh@42 -- # jq -r . 00:07:21.334 [2024-11-28 07:32:32.093300] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:21.334 [2024-11-28 07:32:32.093392] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659536 ] 00:07:21.593 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.593 [2024-11-28 07:32:32.160921] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.593 [2024-11-28 07:32:32.195542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=0x1 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=decompress 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=software 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=32 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=32 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=1 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.593 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.593 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.593 07:32:32 -- accel/accel.sh@21 -- # val=Yes 00:07:21.594 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.594 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.594 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:21.594 07:32:32 -- accel/accel.sh@21 -- # val= 00:07:21.594 07:32:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # IFS=: 00:07:21.594 07:32:32 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.971 07:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.971 07:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.971 07:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.971 07:32:33 
-- accel/accel.sh@20 -- # IFS=: 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.971 07:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@21 -- # val= 00:07:22.971 07:32:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # IFS=: 00:07:22.971 07:32:33 -- accel/accel.sh@20 -- # read -r var val 00:07:22.971 07:32:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:22.971 07:32:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:22.971 07:32:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.971 00:07:22.971 real 0m2.582s 00:07:22.971 user 0m2.321s 00:07:22.971 sys 0m0.258s 00:07:22.971 07:32:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:22.971 07:32:33 -- common/autotest_common.sh@10 -- # set +x 00:07:22.971 ************************************ 00:07:22.971 END TEST accel_decomp_full 00:07:22.971 ************************************ 00:07:22.971 07:32:33 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:22.971 07:32:33 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:22.971 07:32:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:22.971 07:32:33 -- common/autotest_common.sh@10 -- # set +x 00:07:22.971 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:22.971 ************************************ 00:07:22.971 START TEST accel_decomp_mcore 00:07:22.971 ************************************ 00:07:22.971 07:32:33 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:22.971 07:32:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:22.971 07:32:33 -- accel/accel.sh@17 -- # local accel_module 00:07:22.971 07:32:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:22.972 07:32:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:22.972 07:32:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.972 07:32:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.972 07:32:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.972 07:32:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.972 07:32:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.972 07:32:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.972 07:32:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.972 07:32:33 -- accel/accel.sh@42 -- # jq -r . 00:07:22.972 [2024-11-28 07:32:33.428049] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
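The mcore case kicked off above hands the same accel_perf binary a 0xf core mask, so DPDK brings up one reactor per core; the EAL parameter dump and the four "Reactor started" notices that follow are its startup output. A minimal sketch of the equivalent standalone invocation, assuming the workspace path this job uses (the harness additionally feeds a generated JSON accel config through -c /dev/fd/62, omitted here):

    # Software decompress on cores 0-3 (-m 0xf) for 1 second (-t 1),
    # verifying results (-y) against the checked-in compressed bib file.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/build/examples/accel_perf" -t 1 -w decompress \
        -l "$SPDK/test/accel/bib" -y -m 0xf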
00:07:22.972 [2024-11-28 07:32:33.428139] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659580 ] 00:07:22.972 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.972 [2024-11-28 07:32:33.502184] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.972 [2024-11-28 07:32:33.541226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.972 [2024-11-28 07:32:33.541245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.972 [2024-11-28 07:32:33.541311] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.972 [2024-11-28 07:32:33.541313] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.351 07:32:34 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:24.351 00:07:24.351 SPDK Configuration: 00:07:24.351 Core mask: 0xf 00:07:24.351 00:07:24.351 Accel Perf Configuration: 00:07:24.351 Workload Type: decompress 00:07:24.351 Transfer size: 4096 bytes 00:07:24.351 Vector count 1 00:07:24.351 Module: software 00:07:24.351 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.351 Queue depth: 32 00:07:24.351 Allocate depth: 32 00:07:24.351 # threads/core: 1 00:07:24.351 Run time: 1 seconds 00:07:24.351 Verify: Yes 00:07:24.351 00:07:24.351 Running for 1 seconds... 00:07:24.351 00:07:24.351 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:24.351 ------------------------------------------------------------------------------------ 00:07:24.351 0,0 78528/s 144 MiB/s 0 0 00:07:24.351 3,0 78720/s 145 MiB/s 0 0 00:07:24.351 2,0 78336/s 144 MiB/s 0 0 00:07:24.351 1,0 78592/s 144 MiB/s 0 0 00:07:24.351 ==================================================================================== 00:07:24.351 Total 314176/s 1227 MiB/s 0 0' 00:07:24.351 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.351 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.351 07:32:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:24.351 07:32:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:24.351 07:32:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.351 07:32:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.351 07:32:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.351 07:32:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.351 07:32:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.351 07:32:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.351 07:32:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.351 07:32:34 -- accel/accel.sh@42 -- # jq -r . 00:07:24.351 [2024-11-28 07:32:34.730626] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
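The Core,Thread table just printed is fixed-format, which makes scripted sanity checks straightforward. A sketch, assuming the run's stdout was saved, stripped of the job's timestamp prefixes, into a hypothetical perf.log:

    # Sum the transfers/s column of the per-core rows; the result
    # should reproduce the 314176/s Total figure above.
    awk -F'[ ,]+' '/^[0-9]+,[0-9]+ /{sum += $3} END {print sum "/s"}' perf.log

The second invocation starting here repeats the identical workload so the script can read the output line by line (the IFS=: / read -r var val loop in the trace) and assert at the end of the case that the software module really handled the decompress opcode.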
00:07:24.352 [2024-11-28 07:32:34.730706] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659603 ] 00:07:24.352 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.352 [2024-11-28 07:32:34.798484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:24.352 [2024-11-28 07:32:34.835035] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.352 [2024-11-28 07:32:34.835136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.352 [2024-11-28 07:32:34.835222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.352 [2024-11-28 07:32:34.835224] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=0xf 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=decompress 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=software 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=32 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=32 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=1 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val=Yes 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:24.352 07:32:34 -- accel/accel.sh@21 -- # val= 00:07:24.352 07:32:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # IFS=: 00:07:24.352 07:32:34 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 
07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@21 -- # val= 00:07:25.290 07:32:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # IFS=: 00:07:25.290 07:32:36 -- accel/accel.sh@20 -- # read -r var val 00:07:25.290 07:32:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.290 07:32:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.290 07:32:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.290 00:07:25.290 real 0m2.606s 00:07:25.290 user 0m9.002s 00:07:25.290 sys 0m0.265s 00:07:25.290 07:32:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:25.290 07:32:36 -- common/autotest_common.sh@10 -- # set +x 00:07:25.290 ************************************ 00:07:25.290 END TEST accel_decomp_mcore 00:07:25.290 ************************************ 00:07:25.290 07:32:36 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.290 07:32:36 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:25.290 07:32:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:25.290 07:32:36 -- common/autotest_common.sh@10 -- # set +x 00:07:25.290 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:25.549 ************************************ 00:07:25.549 START TEST accel_decomp_full_mcore 00:07:25.549 ************************************ 00:07:25.549 07:32:36 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.549 07:32:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.549 07:32:36 -- accel/accel.sh@17 -- # local accel_module 00:07:25.549 07:32:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.549 07:32:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:25.549 07:32:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.549 07:32:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.549 07:32:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.549 07:32:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.549 07:32:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.549 07:32:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.549 07:32:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.549 07:32:36 -- accel/accel.sh@42 -- # jq -r . 00:07:25.549 [2024-11-28 07:32:36.081853] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
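accel_decomp_full_mcore differs from the previous case only by -o 0: the earlier mcore run moved 4096-byte vectors, while the configuration dump below reports a 111250-byte transfer size, evidently sized from the input data rather than the default block size (an inference from the two configuration dumps, not documented flag semantics). The standalone equivalent, under the same path assumption as the earlier sketch:

    # Full-buffer decompress across cores 0-3; with -o 0 the transfer
    # size ends up at 111250 bytes in this run instead of 4096.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/build/examples/accel_perf" -t 1 -w decompress \
        -l "$SPDK/test/accel/bib" -y -o 0 -m 0xf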
00:07:25.549 [2024-11-28 07:32:36.081951] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659646 ] 00:07:25.549 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.549 [2024-11-28 07:32:36.150283] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:25.549 [2024-11-28 07:32:36.187996] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.549 [2024-11-28 07:32:36.188092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:25.549 [2024-11-28 07:32:36.188178] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:25.549 [2024-11-28 07:32:36.188179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.928 07:32:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:26.928 00:07:26.928 SPDK Configuration: 00:07:26.928 Core mask: 0xf 00:07:26.928 00:07:26.928 Accel Perf Configuration: 00:07:26.928 Workload Type: decompress 00:07:26.928 Transfer size: 111250 bytes 00:07:26.928 Vector count 1 00:07:26.928 Module: software 00:07:26.928 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.928 Queue depth: 32 00:07:26.928 Allocate depth: 32 00:07:26.928 # threads/core: 1 00:07:26.928 Run time: 1 seconds 00:07:26.928 Verify: Yes 00:07:26.928 00:07:26.928 Running for 1 seconds... 00:07:26.928 00:07:26.928 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:26.928 ------------------------------------------------------------------------------------ 00:07:26.928 0,0 5792/s 239 MiB/s 0 0 00:07:26.928 3,0 5824/s 240 MiB/s 0 0 00:07:26.928 2,0 5824/s 240 MiB/s 0 0 00:07:26.928 1,0 5824/s 240 MiB/s 0 0 00:07:26.928 ==================================================================================== 00:07:26.928 Total 23264/s 2468 MiB/s 0 0' 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.928 07:32:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:26.928 07:32:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.928 07:32:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.928 07:32:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.928 07:32:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.928 07:32:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.928 07:32:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.928 07:32:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.928 07:32:37 -- accel/accel.sh@42 -- # jq -r . 00:07:26.928 [2024-11-28 07:32:37.384228] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:26.928 [2024-11-28 07:32:37.384323] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659668 ] 00:07:26.928 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.928 [2024-11-28 07:32:37.450996] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:26.928 [2024-11-28 07:32:37.487443] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.928 [2024-11-28 07:32:37.487538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:26.928 [2024-11-28 07:32:37.487624] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:26.928 [2024-11-28 07:32:37.487641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val=0xf 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val=decompress 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.928 07:32:37 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.928 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.928 07:32:37 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:26.928 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=software 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@23 -- # accel_module=software 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=32 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=32 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=1 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val=Yes 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:26.929 07:32:37 -- accel/accel.sh@21 -- # val= 00:07:26.929 07:32:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # IFS=: 00:07:26.929 07:32:37 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 
07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@21 -- # val= 00:07:28.311 07:32:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # IFS=: 00:07:28.311 07:32:38 -- accel/accel.sh@20 -- # read -r var val 00:07:28.311 07:32:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:28.311 07:32:38 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:28.311 07:32:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.311 00:07:28.311 real 0m2.610s 00:07:28.311 user 0m9.036s 00:07:28.311 sys 0m0.279s 00:07:28.311 07:32:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:28.311 07:32:38 -- common/autotest_common.sh@10 -- # set +x 00:07:28.311 ************************************ 00:07:28.311 END TEST accel_decomp_full_mcore 00:07:28.311 ************************************ 00:07:28.311 07:32:38 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:28.311 07:32:38 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:28.311 07:32:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:28.311 07:32:38 -- common/autotest_common.sh@10 -- # set +x 00:07:28.311 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:28.311 ************************************ 00:07:28.311 START TEST accel_decomp_mthread 00:07:28.311 ************************************ 00:07:28.311 07:32:38 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:28.311 07:32:38 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.311 07:32:38 -- accel/accel.sh@17 -- # local accel_module 00:07:28.311 07:32:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:28.311 07:32:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:28.311 07:32:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.311 07:32:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.311 07:32:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.311 07:32:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.311 07:32:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.311 07:32:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.311 07:32:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.311 07:32:38 -- accel/accel.sh@42 -- # jq -r . 00:07:28.311 [2024-11-28 07:32:38.733740] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
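accel_decomp_mthread drops back to a single reactor (the EAL line below shows -c 0x1) but passes -T 2, asking accel_perf for two worker threads on that core; they surface in the results table as Core,Thread rows 0,0 and 0,1. A sketch of the direct invocation, same assumptions as above:

    # One core, two threads per core (-T 2), software decompress.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/build/examples/accel_perf" -t 1 -w decompress \
        -l "$SPDK/test/accel/bib" -y -T 2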
00:07:28.311 [2024-11-28 07:32:38.733808] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659713 ] 00:07:28.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.311 [2024-11-28 07:32:38.797917] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.311 [2024-11-28 07:32:38.832650] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.250 07:32:39 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:29.250 00:07:29.250 SPDK Configuration: 00:07:29.250 Core mask: 0x1 00:07:29.250 00:07:29.250 Accel Perf Configuration: 00:07:29.250 Workload Type: decompress 00:07:29.250 Transfer size: 4096 bytes 00:07:29.250 Vector count 1 00:07:29.250 Module: software 00:07:29.250 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.250 Queue depth: 32 00:07:29.250 Allocate depth: 32 00:07:29.250 # threads/core: 2 00:07:29.250 Run time: 1 seconds 00:07:29.250 Verify: Yes 00:07:29.250 00:07:29.250 Running for 1 seconds... 00:07:29.250 00:07:29.250 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:29.250 ------------------------------------------------------------------------------------ 00:07:29.250 0,1 47968/s 88 MiB/s 0 0 00:07:29.250 0,0 47872/s 88 MiB/s 0 0 00:07:29.250 ==================================================================================== 00:07:29.250 Total 95840/s 374 MiB/s 0 0' 00:07:29.250 07:32:39 -- accel/accel.sh@20 -- # IFS=: 00:07:29.250 07:32:39 -- accel/accel.sh@20 -- # read -r var val 00:07:29.250 07:32:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.250 07:32:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:29.250 07:32:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:29.250 07:32:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:29.250 07:32:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.250 07:32:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.250 07:32:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:29.250 07:32:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:29.250 07:32:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:29.250 07:32:39 -- accel/accel.sh@42 -- # jq -r . 00:07:29.250 [2024-11-28 07:32:40.014980] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:29.250 [2024-11-28 07:32:40.015072] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659732 ] 00:07:29.509 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.509 [2024-11-28 07:32:40.084933] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.509 [2024-11-28 07:32:40.122122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val=0x1 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val=decompress 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.509 07:32:40 -- accel/accel.sh@21 -- # val=software 00:07:29.509 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.509 07:32:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:29.509 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val=32 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 
07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val=32 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val=2 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val=Yes 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:29.510 07:32:40 -- accel/accel.sh@21 -- # val= 00:07:29.510 07:32:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # IFS=: 00:07:29.510 07:32:40 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@21 -- # val= 00:07:30.889 07:32:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # IFS=: 00:07:30.889 07:32:41 -- accel/accel.sh@20 -- # read -r var val 00:07:30.889 07:32:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:30.889 07:32:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:30.889 07:32:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:30.889 00:07:30.889 real 0m2.572s 00:07:30.889 user 0m2.329s 00:07:30.889 sys 0m0.255s 00:07:30.889 07:32:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:30.889 07:32:41 -- common/autotest_common.sh@10 -- # 
set +x 00:07:30.889 ************************************ 00:07:30.889 END TEST accel_decomp_mthread 00:07:30.889 ************************************ 00:07:30.889 07:32:41 -- accel/accel.sh@114 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:30.889 07:32:41 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:30.889 07:32:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.889 07:32:41 -- common/autotest_common.sh@10 -- # set +x 00:07:30.889 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:30.889 ************************************ 00:07:30.889 START TEST accel_decomp_full_mthread 00:07:30.889 ************************************ 00:07:30.889 07:32:41 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:30.889 07:32:41 -- accel/accel.sh@16 -- # local accel_opc 00:07:30.889 07:32:41 -- accel/accel.sh@17 -- # local accel_module 00:07:30.889 07:32:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:30.889 07:32:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:30.889 07:32:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:30.889 07:32:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:30.889 07:32:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.889 07:32:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.889 07:32:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:30.889 07:32:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:30.889 07:32:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:30.889 07:32:41 -- accel/accel.sh@42 -- # jq -r . 00:07:30.889 [2024-11-28 07:32:41.362934] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:30.890 [2024-11-28 07:32:41.363019] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659772 ] 00:07:30.890 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.890 [2024-11-28 07:32:41.430284] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.890 [2024-11-28 07:32:41.465017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.269 07:32:42 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:32.269 00:07:32.269 SPDK Configuration: 00:07:32.269 Core mask: 0x1 00:07:32.269 00:07:32.269 Accel Perf Configuration: 00:07:32.269 Workload Type: decompress 00:07:32.270 Transfer size: 111250 bytes 00:07:32.270 Vector count 1 00:07:32.270 Module: software 00:07:32.270 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.270 Queue depth: 32 00:07:32.270 Allocate depth: 32 00:07:32.270 # threads/core: 2 00:07:32.270 Run time: 1 seconds 00:07:32.270 Verify: Yes 00:07:32.270 00:07:32.270 Running for 1 seconds...
00:07:32.270 00:07:32.270 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:32.270 ------------------------------------------------------------------------------------ 00:07:32.270 0,1 3040/s 125 MiB/s 0 0 00:07:32.270 0,0 3008/s 124 MiB/s 0 0 00:07:32.270 ==================================================================================== 00:07:32.270 Total 6048/s 641 MiB/s 0 0' 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:32.270 07:32:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:32.270 07:32:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:32.270 07:32:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:32.270 07:32:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:32.270 07:32:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:32.270 07:32:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:32.270 07:32:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:32.270 07:32:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:32.270 07:32:42 -- accel/accel.sh@42 -- # jq -r . 00:07:32.270 [2024-11-28 07:32:42.667724] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:32.270 [2024-11-28 07:32:42.667815] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659798 ] 00:07:32.270 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.270 [2024-11-28 07:32:42.735758] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.270 [2024-11-28 07:32:42.769492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=0x1 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=decompress 
00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=software 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@23 -- # accel_module=software 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=32 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=32 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=2 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val=Yes 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:32.270 07:32:42 -- accel/accel.sh@21 -- # val= 00:07:32.270 07:32:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # IFS=: 00:07:32.270 07:32:42 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@21 -- # val= 00:07:33.208 07:32:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # IFS=: 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@21 -- # val= 00:07:33.208 07:32:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # IFS=: 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@21 -- # val= 00:07:33.208 07:32:43 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # IFS=: 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@21 -- # val= 00:07:33.208 07:32:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # IFS=: 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@21 -- # val= 00:07:33.208 07:32:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # IFS=: 00:07:33.208 07:32:43 -- accel/accel.sh@20 -- # read -r var val 00:07:33.208 07:32:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:33.208 07:32:43 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:33.208 07:32:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:33.209 00:07:33.209 real 0m2.615s 00:07:33.209 user 0m2.367s 00:07:33.209 sys 0m0.257s 00:07:33.209 07:32:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:33.209 07:32:43 -- common/autotest_common.sh@10 -- # set +x 00:07:33.209 ************************************ 00:07:33.209 END TEST accel_decomp_full_mthread 00:07:33.209 ************************************ 00:07:33.468 07:32:43 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:33.468 07:32:43 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:33.468 07:32:43 -- accel/accel.sh@129 -- # build_accel_config 00:07:33.468 07:32:43 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:33.468 07:32:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:33.468 07:32:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:33.468 07:32:43 -- common/autotest_common.sh@10 -- # set +x 00:07:33.468 07:32:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.468 07:32:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.469 07:32:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:33.469 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:33.469 07:32:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:33.469 07:32:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:33.469 07:32:43 -- accel/accel.sh@42 -- # jq -r . 00:07:33.469 ************************************ 00:07:33.469 START TEST accel_dif_functional_tests 00:07:33.469 ************************************ 00:07:33.469 07:32:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:33.469 [2024-11-28 07:32:44.026186] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
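The DIF suite starting here is a CUnit binary rather than an accel_perf run: it exercises SPDK's Data Integrity Field generate/verify paths, and its negative cases corrupt a tag on purpose, so the dif.c ERROR lines below (Failed to compare Guard / App Tag / Ref Tag) are the expected outcome — each such test still reports "passed" because the verify path rejected the bad tag as intended. The binary can be run the same way the harness does, under the usual path assumption:

    # DIF functional tests; the harness supplies the accel JSON config
    # on file descriptor 62.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/test/accel/dif/dif" -c /dev/fd/62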
00:07:33.469 [2024-11-28 07:32:44.026276] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659835 ] 00:07:33.469 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.469 [2024-11-28 07:32:44.093722] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.469 [2024-11-28 07:32:44.130552] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.469 [2024-11-28 07:32:44.130638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.469 [2024-11-28 07:32:44.130635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.469 00:07:33.469 00:07:33.469 CUnit - A unit testing framework for C - Version 2.1-3 00:07:33.469 http://cunit.sourceforge.net/ 00:07:33.469 00:07:33.469 00:07:33.469 Suite: accel_dif 00:07:33.469 Test: verify: DIF generated, GUARD check ...passed 00:07:33.469 Test: verify: DIF generated, APPTAG check ...passed 00:07:33.469 Test: verify: DIF generated, REFTAG check ...passed 00:07:33.469 Test: verify: DIF not generated, GUARD check ...[2024-11-28 07:32:44.194018] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:33.469 [2024-11-28 07:32:44.194071] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:33.469 passed 00:07:33.469 Test: verify: DIF not generated, APPTAG check ...[2024-11-28 07:32:44.194105] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:33.469 [2024-11-28 07:32:44.194124] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:33.469 passed 00:07:33.469 Test: verify: DIF not generated, REFTAG check ...[2024-11-28 07:32:44.194146] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:33.469 [2024-11-28 07:32:44.194164] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:33.469 passed 00:07:33.469 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:33.469 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-28 07:32:44.194206] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:33.469 passed 00:07:33.469 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:33.469 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:33.469 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:33.469 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-28 07:32:44.194307] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:33.469 passed 00:07:33.469 Test: generate copy: DIF generated, GUARD check ...passed 00:07:33.469 Test: generate copy: DIF generated, APPTAG check ...passed 00:07:33.469 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:33.469 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:33.469 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:33.469 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:33.469 Test: generate copy: iovecs-len validate ...[2024-11-28 07:32:44.194486] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size.
00:07:33.469 passed 00:07:33.469 Test: generate copy: buffer alignment validate ...passed 00:07:33.469 00:07:33.469 Run Summary: Type Total Ran Passed Failed Inactive 00:07:33.469 suites 1 1 n/a 0 0 00:07:33.469 tests 20 20 20 0 0 00:07:33.469 asserts 204 204 204 0 n/a 00:07:33.469 00:07:33.469 Elapsed time = 0.000 seconds 00:07:33.729 00:07:33.729 real 0m0.339s 00:07:33.729 user 0m0.517s 00:07:33.729 sys 0m0.162s 00:07:33.729 07:32:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:33.729 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:33.729 ************************************ 00:07:33.729 END TEST accel_dif_functional_tests 00:07:33.729 ************************************ 00:07:33.729 00:07:33.729 real 0m54.925s 00:07:33.729 user 1m26.491s 00:07:33.729 sys 1m2.798s 00:07:33.729 07:32:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:33.729 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:33.729 ************************************ 00:07:33.729 END TEST accel 00:07:33.729 ************************************ 00:07:33.729 07:32:44 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:33.729 07:32:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:33.729 07:32:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:33.729 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:33.729 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:33.729 ************************************ 00:07:33.729 START TEST accel_rpc 00:07:33.729 ************************************ 00:07:33.729 07:32:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:33.989 * Looking for test storage... 00:07:33.989 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:33.989 07:32:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:33.989 07:32:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:33.989 07:32:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:33.989 07:32:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:33.989 07:32:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:33.989 07:32:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:33.989 07:32:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:33.989 07:32:44 -- scripts/common.sh@335 -- # IFS=.-: 00:07:33.989 07:32:44 -- scripts/common.sh@335 -- # read -ra ver1 00:07:33.989 07:32:44 -- scripts/common.sh@336 -- # IFS=.-: 00:07:33.989 07:32:44 -- scripts/common.sh@336 -- # read -ra ver2 00:07:33.989 07:32:44 -- scripts/common.sh@337 -- # local 'op=<' 00:07:33.989 07:32:44 -- scripts/common.sh@339 -- # ver1_l=2 00:07:33.989 07:32:44 -- scripts/common.sh@340 -- # ver2_l=1 00:07:33.989 07:32:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:33.989 07:32:44 -- scripts/common.sh@343 -- # case "$op" in 00:07:33.989 07:32:44 -- scripts/common.sh@344 -- # : 1 00:07:33.989 07:32:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:33.989 07:32:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:33.989 07:32:44 -- scripts/common.sh@364 -- # decimal 1 00:07:33.989 07:32:44 -- scripts/common.sh@352 -- # local d=1 00:07:33.989 07:32:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:33.989 07:32:44 -- scripts/common.sh@354 -- # echo 1 00:07:33.989 07:32:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:33.989 07:32:44 -- scripts/common.sh@365 -- # decimal 2 00:07:33.989 07:32:44 -- scripts/common.sh@352 -- # local d=2 00:07:33.989 07:32:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:33.989 07:32:44 -- scripts/common.sh@354 -- # echo 2 00:07:33.989 07:32:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:33.989 07:32:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:33.989 07:32:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:33.989 07:32:44 -- scripts/common.sh@367 -- # return 0 00:07:33.989 07:32:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:33.989 07:32:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:33.989 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.989 --rc genhtml_branch_coverage=1 00:07:33.989 --rc genhtml_function_coverage=1 00:07:33.989 --rc genhtml_legend=1 00:07:33.989 --rc geninfo_all_blocks=1 00:07:33.989 --rc geninfo_unexecuted_blocks=1 00:07:33.989 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.989 ' 00:07:33.989 07:32:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:33.989 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.989 --rc genhtml_branch_coverage=1 00:07:33.989 --rc genhtml_function_coverage=1 00:07:33.989 --rc genhtml_legend=1 00:07:33.989 --rc geninfo_all_blocks=1 00:07:33.989 --rc geninfo_unexecuted_blocks=1 00:07:33.989 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.989 ' 00:07:33.989 07:32:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:33.989 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.989 --rc genhtml_branch_coverage=1 00:07:33.989 --rc genhtml_function_coverage=1 00:07:33.989 --rc genhtml_legend=1 00:07:33.989 --rc geninfo_all_blocks=1 00:07:33.989 --rc geninfo_unexecuted_blocks=1 00:07:33.989 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.989 ' 00:07:33.989 07:32:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:33.989 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.989 --rc genhtml_branch_coverage=1 00:07:33.989 --rc genhtml_function_coverage=1 00:07:33.989 --rc genhtml_legend=1 00:07:33.989 --rc geninfo_all_blocks=1 00:07:33.989 --rc geninfo_unexecuted_blocks=1 00:07:33.989 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.989 ' 00:07:33.989 07:32:44 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:33.989 07:32:44 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1659902 00:07:33.989 07:32:44 -- accel/accel_rpc.sh@15 -- # waitforlisten 1659902 00:07:33.989 07:32:44 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:33.989 07:32:44 -- common/autotest_common.sh@829 -- # '[' -z 1659902 ']' 00:07:33.989 07:32:44 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:33.989 07:32:44 -- common/autotest_common.sh@834 -- # local max_retries=100 
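The scripts/common.sh xtrace above is cmp_versions splitting 1.15 and 2 on dots and comparing them field by field; autotest uses it (via lt) to decide whether the installed lcov predates 2.x before exporting the matching --rc option spellings. A condensed sketch of the same comparison (a simplification: the real cmp_versions also splits on '-' and ':' and takes the operator as an argument):

  version_lt() {                        # returns 0 (true) when $1 < $2
    local IFS=. i
    local -a a=($1) b=($2)
    for ((i = 0; i < (${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]}); i++)); do
      (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # missing fields count as 0
      (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
    done
    return 1                            # equal is not less-than
  }
  version_lt 1.15 2 && echo 'lcov predates 2.x: use the legacy option names'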
00:07:33.989 07:32:44 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:33.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:33.989 07:32:44 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:33.989 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:33.989 [2024-11-28 07:32:44.647332] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:33.989 [2024-11-28 07:32:44.647406] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659902 ] 00:07:33.989 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.989 [2024-11-28 07:32:44.713938] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.989 [2024-11-28 07:32:44.751457] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.989 [2024-11-28 07:32:44.751573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.249 07:32:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:34.249 07:32:44 -- common/autotest_common.sh@862 -- # return 0 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:34.249 07:32:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:34.249 07:32:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.249 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:34.249 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:34.249 ************************************ 00:07:34.249 START TEST accel_assign_opcode 00:07:34.249 ************************************ 00:07:34.249 07:32:44 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:34.249 07:32:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:34.249 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:34.249 [2024-11-28 07:32:44.824066] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:34.249 07:32:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:34.249 07:32:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:34.249 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:34.249 [2024-11-28 07:32:44.832083] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:34.249 07:32:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:34.249 07:32:44 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:34.249 07:32:44 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:34.249 07:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:34.249 07:32:44 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:34.249 07:32:44 -- 
accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:34.249 07:32:45 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:34.249 07:32:45 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:34.249 07:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.249 07:32:45 -- accel/accel_rpc.sh@42 -- # grep software 00:07:34.249 07:32:45 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:34.507 software 00:07:34.507 00:07:34.507 real 0m0.224s 00:07:34.507 user 0m0.049s 00:07:34.507 sys 0m0.011s 00:07:34.507 07:32:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:34.507 07:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.507 ************************************ 00:07:34.507 END TEST accel_assign_opcode 00:07:34.507 ************************************ 00:07:34.507 07:32:45 -- accel/accel_rpc.sh@55 -- # killprocess 1659902 00:07:34.507 07:32:45 -- common/autotest_common.sh@936 -- # '[' -z 1659902 ']' 00:07:34.507 07:32:45 -- common/autotest_common.sh@940 -- # kill -0 1659902 00:07:34.507 07:32:45 -- common/autotest_common.sh@941 -- # uname 00:07:34.507 07:32:45 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:34.507 07:32:45 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1659902 00:07:34.507 07:32:45 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:34.507 07:32:45 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:34.507 07:32:45 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1659902' 00:07:34.507 killing process with pid 1659902 00:07:34.507 07:32:45 -- common/autotest_common.sh@955 -- # kill 1659902 00:07:34.507 07:32:45 -- common/autotest_common.sh@960 -- # wait 1659902 00:07:34.766 00:07:34.766 real 0m0.990s 00:07:34.766 user 0m0.892s 00:07:34.766 sys 0m0.452s 00:07:34.766 07:32:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:34.766 07:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.766 ************************************ 00:07:34.766 END TEST accel_rpc 00:07:34.766 ************************************ 00:07:34.766 07:32:45 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:34.766 07:32:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:34.766 07:32:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.766 07:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:34.766 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:34.766 ************************************ 00:07:34.766 START TEST app_cmdline 00:07:34.766 ************************************ 00:07:34.766 07:32:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:35.026 * Looking for test storage... 
00:07:35.026 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:35.026 07:32:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:35.026 07:32:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:35.026 07:32:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:35.026 07:32:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:35.026 07:32:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:35.026 07:32:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:35.026 07:32:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:35.026 07:32:45 -- scripts/common.sh@335 -- # IFS=.-: 00:07:35.026 07:32:45 -- scripts/common.sh@335 -- # read -ra ver1 00:07:35.026 07:32:45 -- scripts/common.sh@336 -- # IFS=.-: 00:07:35.026 07:32:45 -- scripts/common.sh@336 -- # read -ra ver2 00:07:35.026 07:32:45 -- scripts/common.sh@337 -- # local 'op=<' 00:07:35.026 07:32:45 -- scripts/common.sh@339 -- # ver1_l=2 00:07:35.026 07:32:45 -- scripts/common.sh@340 -- # ver2_l=1 00:07:35.026 07:32:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:35.026 07:32:45 -- scripts/common.sh@343 -- # case "$op" in 00:07:35.026 07:32:45 -- scripts/common.sh@344 -- # : 1 00:07:35.026 07:32:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:35.026 07:32:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:35.026 07:32:45 -- scripts/common.sh@364 -- # decimal 1 00:07:35.026 07:32:45 -- scripts/common.sh@352 -- # local d=1 00:07:35.026 07:32:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:35.026 07:32:45 -- scripts/common.sh@354 -- # echo 1 00:07:35.026 07:32:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:35.026 07:32:45 -- scripts/common.sh@365 -- # decimal 2 00:07:35.026 07:32:45 -- scripts/common.sh@352 -- # local d=2 00:07:35.026 07:32:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:35.026 07:32:45 -- scripts/common.sh@354 -- # echo 2 00:07:35.026 07:32:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:35.026 07:32:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:35.026 07:32:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:35.026 07:32:45 -- scripts/common.sh@367 -- # return 0 00:07:35.026 07:32:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:35.026 07:32:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:35.026 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.026 --rc genhtml_branch_coverage=1 00:07:35.026 --rc genhtml_function_coverage=1 00:07:35.026 --rc genhtml_legend=1 00:07:35.026 --rc geninfo_all_blocks=1 00:07:35.026 --rc geninfo_unexecuted_blocks=1 00:07:35.026 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.026 ' 00:07:35.026 07:32:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:35.026 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.026 --rc genhtml_branch_coverage=1 00:07:35.026 --rc genhtml_function_coverage=1 00:07:35.026 --rc genhtml_legend=1 00:07:35.026 --rc geninfo_all_blocks=1 00:07:35.026 --rc geninfo_unexecuted_blocks=1 00:07:35.026 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.026 ' 00:07:35.026 07:32:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:35.026 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.026 --rc genhtml_branch_coverage=1 00:07:35.026 
--rc genhtml_function_coverage=1 00:07:35.026 --rc genhtml_legend=1 00:07:35.026 --rc geninfo_all_blocks=1 00:07:35.026 --rc geninfo_unexecuted_blocks=1 00:07:35.026 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.026 ' 00:07:35.026 07:32:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:35.026 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:35.026 --rc genhtml_branch_coverage=1 00:07:35.026 --rc genhtml_function_coverage=1 00:07:35.026 --rc genhtml_legend=1 00:07:35.026 --rc geninfo_all_blocks=1 00:07:35.026 --rc geninfo_unexecuted_blocks=1 00:07:35.026 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:35.026 ' 00:07:35.026 07:32:45 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:35.026 07:32:45 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1659994 00:07:35.026 07:32:45 -- app/cmdline.sh@18 -- # waitforlisten 1659994 00:07:35.026 07:32:45 -- common/autotest_common.sh@829 -- # '[' -z 1659994 ']' 00:07:35.026 07:32:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.026 07:32:45 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:35.026 07:32:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.026 07:32:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:35.026 07:32:45 -- common/autotest_common.sh@10 -- # set +x 00:07:35.026 07:32:45 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:35.026 [2024-11-28 07:32:45.675846] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
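The app_cmdline test whose target startup appears below launches spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods are served and every other call fails with JSON-RPC error -32601; that is exactly what the env_dpdk_get_mem_stats probe further down confirms. Driven by hand from this workspace it looks roughly like this (a sketch: the sleep stands in for the suite's waitforlisten helper, and hugepages must already be configured):

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
  sleep 1                                           # crude wait for /var/tmp/spdk.sock
  scripts/rpc.py spdk_get_version | jq -r .version  # SPDK v24.01.1-pre git sha1 c13c99a5e
  scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort
  scripts/rpc.py env_dpdk_get_mem_stats             # fails: "Method not found" (-32601)
  kill $!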
00:07:35.026 [2024-11-28 07:32:45.675912] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1659994 ] 00:07:35.026 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.026 [2024-11-28 07:32:45.742100] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.026 [2024-11-28 07:32:45.779751] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.026 [2024-11-28 07:32:45.779872] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.964 07:32:46 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:35.964 07:32:46 -- common/autotest_common.sh@862 -- # return 0 00:07:35.964 07:32:46 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:35.964 { 00:07:35.964 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:35.964 "fields": { 00:07:35.964 "major": 24, 00:07:35.964 "minor": 1, 00:07:35.964 "patch": 1, 00:07:35.964 "suffix": "-pre", 00:07:35.964 "commit": "c13c99a5e" 00:07:35.964 } 00:07:35.964 } 00:07:35.964 07:32:46 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:35.964 07:32:46 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:35.964 07:32:46 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:35.964 07:32:46 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:35.964 07:32:46 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:35.964 07:32:46 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:35.964 07:32:46 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:35.964 07:32:46 -- common/autotest_common.sh@10 -- # set +x 00:07:35.964 07:32:46 -- app/cmdline.sh@26 -- # sort 00:07:35.964 07:32:46 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:35.964 07:32:46 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:35.964 07:32:46 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:35.964 07:32:46 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.964 07:32:46 -- common/autotest_common.sh@650 -- # local es=0 00:07:35.964 07:32:46 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.964 07:32:46 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.964 07:32:46 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.964 07:32:46 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.964 07:32:46 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.964 07:32:46 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.964 07:32:46 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.964 07:32:46 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.964 07:32:46 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:35.964 07:32:46 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:36.223 request: 00:07:36.223 { 00:07:36.223 "method": "env_dpdk_get_mem_stats", 00:07:36.223 "req_id": 1 00:07:36.223 } 00:07:36.223 Got JSON-RPC error response 00:07:36.223 response: 00:07:36.223 { 00:07:36.223 "code": -32601, 00:07:36.223 "message": "Method not found" 00:07:36.223 } 00:07:36.223 07:32:46 -- common/autotest_common.sh@653 -- # es=1 00:07:36.223 07:32:46 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:36.223 07:32:46 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:36.223 07:32:46 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:36.223 07:32:46 -- app/cmdline.sh@1 -- # killprocess 1659994 00:07:36.223 07:32:46 -- common/autotest_common.sh@936 -- # '[' -z 1659994 ']' 00:07:36.223 07:32:46 -- common/autotest_common.sh@940 -- # kill -0 1659994 00:07:36.223 07:32:46 -- common/autotest_common.sh@941 -- # uname 00:07:36.223 07:32:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:36.223 07:32:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1659994 00:07:36.223 07:32:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:36.223 07:32:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:36.223 07:32:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1659994' 00:07:36.223 killing process with pid 1659994 00:07:36.223 07:32:46 -- common/autotest_common.sh@955 -- # kill 1659994 00:07:36.223 07:32:46 -- common/autotest_common.sh@960 -- # wait 1659994 00:07:36.792 00:07:36.792 real 0m1.786s 00:07:36.792 user 0m2.085s 00:07:36.792 sys 0m0.501s 00:07:36.792 07:32:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:36.792 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:36.792 ************************************ 00:07:36.792 END TEST app_cmdline 00:07:36.792 ************************************ 00:07:36.792 07:32:47 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:36.792 07:32:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:36.792 07:32:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:36.793 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:36.793 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:36.793 ************************************ 00:07:36.793 START TEST version 00:07:36.793 ************************************ 00:07:36.793 07:32:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:36.793 * Looking for test storage... 
00:07:36.793 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:36.793 07:32:47 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:36.793 07:32:47 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:36.793 07:32:47 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:36.793 07:32:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:36.793 07:32:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:36.793 07:32:47 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:36.793 07:32:47 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:36.793 07:32:47 -- scripts/common.sh@335 -- # IFS=.-: 00:07:36.793 07:32:47 -- scripts/common.sh@335 -- # read -ra ver1 00:07:36.793 07:32:47 -- scripts/common.sh@336 -- # IFS=.-: 00:07:36.793 07:32:47 -- scripts/common.sh@336 -- # read -ra ver2 00:07:36.793 07:32:47 -- scripts/common.sh@337 -- # local 'op=<' 00:07:36.793 07:32:47 -- scripts/common.sh@339 -- # ver1_l=2 00:07:36.793 07:32:47 -- scripts/common.sh@340 -- # ver2_l=1 00:07:36.793 07:32:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:36.793 07:32:47 -- scripts/common.sh@343 -- # case "$op" in 00:07:36.793 07:32:47 -- scripts/common.sh@344 -- # : 1 00:07:36.793 07:32:47 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:36.793 07:32:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:36.793 07:32:47 -- scripts/common.sh@364 -- # decimal 1 00:07:36.793 07:32:47 -- scripts/common.sh@352 -- # local d=1 00:07:36.793 07:32:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:36.793 07:32:47 -- scripts/common.sh@354 -- # echo 1 00:07:36.793 07:32:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:36.793 07:32:47 -- scripts/common.sh@365 -- # decimal 2 00:07:36.793 07:32:47 -- scripts/common.sh@352 -- # local d=2 00:07:36.793 07:32:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:36.793 07:32:47 -- scripts/common.sh@354 -- # echo 2 00:07:36.793 07:32:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:36.793 07:32:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:36.793 07:32:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:36.793 07:32:47 -- scripts/common.sh@367 -- # return 0 00:07:36.793 07:32:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:36.793 07:32:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:36.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.793 --rc genhtml_branch_coverage=1 00:07:36.793 --rc genhtml_function_coverage=1 00:07:36.793 --rc genhtml_legend=1 00:07:36.793 --rc geninfo_all_blocks=1 00:07:36.793 --rc geninfo_unexecuted_blocks=1 00:07:36.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.793 ' 00:07:36.793 07:32:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:36.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.793 --rc genhtml_branch_coverage=1 00:07:36.793 --rc genhtml_function_coverage=1 00:07:36.793 --rc genhtml_legend=1 00:07:36.793 --rc geninfo_all_blocks=1 00:07:36.793 --rc geninfo_unexecuted_blocks=1 00:07:36.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.793 ' 00:07:36.793 07:32:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:36.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.793 --rc genhtml_branch_coverage=1 00:07:36.793 
--rc genhtml_function_coverage=1 00:07:36.793 --rc genhtml_legend=1 00:07:36.793 --rc geninfo_all_blocks=1 00:07:36.793 --rc geninfo_unexecuted_blocks=1 00:07:36.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.793 ' 00:07:36.793 07:32:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:36.793 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.793 --rc genhtml_branch_coverage=1 00:07:36.793 --rc genhtml_function_coverage=1 00:07:36.793 --rc genhtml_legend=1 00:07:36.793 --rc geninfo_all_blocks=1 00:07:36.793 --rc geninfo_unexecuted_blocks=1 00:07:36.793 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.793 ' 00:07:36.793 07:32:47 -- app/version.sh@17 -- # get_header_version major 00:07:36.793 07:32:47 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.793 07:32:47 -- app/version.sh@14 -- # cut -f2 00:07:36.793 07:32:47 -- app/version.sh@14 -- # tr -d '"' 00:07:36.793 07:32:47 -- app/version.sh@17 -- # major=24 00:07:36.793 07:32:47 -- app/version.sh@18 -- # get_header_version minor 00:07:36.793 07:32:47 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.793 07:32:47 -- app/version.sh@14 -- # cut -f2 00:07:36.793 07:32:47 -- app/version.sh@14 -- # tr -d '"' 00:07:36.793 07:32:47 -- app/version.sh@18 -- # minor=1 00:07:36.793 07:32:47 -- app/version.sh@19 -- # get_header_version patch 00:07:36.793 07:32:47 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.793 07:32:47 -- app/version.sh@14 -- # cut -f2 00:07:36.793 07:32:47 -- app/version.sh@14 -- # tr -d '"' 00:07:36.793 07:32:47 -- app/version.sh@19 -- # patch=1 00:07:36.793 07:32:47 -- app/version.sh@20 -- # get_header_version suffix 00:07:36.793 07:32:47 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.793 07:32:47 -- app/version.sh@14 -- # cut -f2 00:07:36.793 07:32:47 -- app/version.sh@14 -- # tr -d '"' 00:07:36.793 07:32:47 -- app/version.sh@20 -- # suffix=-pre 00:07:36.793 07:32:47 -- app/version.sh@22 -- # version=24.1 00:07:36.793 07:32:47 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:36.793 07:32:47 -- app/version.sh@25 -- # version=24.1.1 00:07:36.793 07:32:47 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:36.793 07:32:47 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:36.793 07:32:47 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:37.052 07:32:47 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:37.052 07:32:47 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:37.052 00:07:37.052 real 0m0.265s 00:07:37.052 user 0m0.154s 00:07:37.052 sys 0m0.164s 00:07:37.052 07:32:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:37.052 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:37.052 
************************************ 00:07:37.052 END TEST version 00:07:37.052 ************************************ 00:07:37.052 07:32:47 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@191 -- # uname -s 00:07:37.052 07:32:47 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:37.052 07:32:47 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:37.052 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:37.052 07:32:47 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:37.052 07:32:47 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:37.052 07:32:47 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:37.052 07:32:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:37.052 07:32:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:37.052 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:37.052 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:37.052 ************************************ 00:07:37.052 START TEST llvm_fuzz 00:07:37.052 ************************************ 00:07:37.052 07:32:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:37.052 * Looking for test storage... 
00:07:37.052 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:37.052 07:32:47 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:37.052 07:32:47 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:37.052 07:32:47 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:37.052 07:32:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:37.052 07:32:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:37.052 07:32:47 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:37.052 07:32:47 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:37.053 07:32:47 -- scripts/common.sh@335 -- # IFS=.-: 00:07:37.053 07:32:47 -- scripts/common.sh@335 -- # read -ra ver1 00:07:37.053 07:32:47 -- scripts/common.sh@336 -- # IFS=.-: 00:07:37.053 07:32:47 -- scripts/common.sh@336 -- # read -ra ver2 00:07:37.053 07:32:47 -- scripts/common.sh@337 -- # local 'op=<' 00:07:37.053 07:32:47 -- scripts/common.sh@339 -- # ver1_l=2 00:07:37.053 07:32:47 -- scripts/common.sh@340 -- # ver2_l=1 00:07:37.053 07:32:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:37.053 07:32:47 -- scripts/common.sh@343 -- # case "$op" in 00:07:37.053 07:32:47 -- scripts/common.sh@344 -- # : 1 00:07:37.053 07:32:47 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:37.053 07:32:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:37.053 07:32:47 -- scripts/common.sh@364 -- # decimal 1 00:07:37.053 07:32:47 -- scripts/common.sh@352 -- # local d=1 00:07:37.053 07:32:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:37.053 07:32:47 -- scripts/common.sh@354 -- # echo 1 00:07:37.053 07:32:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:37.053 07:32:47 -- scripts/common.sh@365 -- # decimal 2 00:07:37.053 07:32:47 -- scripts/common.sh@352 -- # local d=2 00:07:37.053 07:32:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:37.053 07:32:47 -- scripts/common.sh@354 -- # echo 2 00:07:37.053 07:32:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:37.053 07:32:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:37.053 07:32:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:37.053 07:32:47 -- scripts/common.sh@367 -- # return 0 00:07:37.053 07:32:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:37.053 07:32:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:37.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.053 --rc genhtml_branch_coverage=1 00:07:37.053 --rc genhtml_function_coverage=1 00:07:37.053 --rc genhtml_legend=1 00:07:37.053 --rc geninfo_all_blocks=1 00:07:37.053 --rc geninfo_unexecuted_blocks=1 00:07:37.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.053 ' 00:07:37.053 07:32:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:37.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.053 --rc genhtml_branch_coverage=1 00:07:37.053 --rc genhtml_function_coverage=1 00:07:37.053 --rc genhtml_legend=1 00:07:37.053 --rc geninfo_all_blocks=1 00:07:37.053 --rc geninfo_unexecuted_blocks=1 00:07:37.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.053 ' 00:07:37.053 07:32:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:37.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.053 --rc genhtml_branch_coverage=1 00:07:37.053 
--rc genhtml_function_coverage=1 00:07:37.053 --rc genhtml_legend=1 00:07:37.053 --rc geninfo_all_blocks=1 00:07:37.053 --rc geninfo_unexecuted_blocks=1 00:07:37.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.053 ' 00:07:37.053 07:32:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:37.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.053 --rc genhtml_branch_coverage=1 00:07:37.053 --rc genhtml_function_coverage=1 00:07:37.053 --rc genhtml_legend=1 00:07:37.053 --rc geninfo_all_blocks=1 00:07:37.053 --rc geninfo_unexecuted_blocks=1 00:07:37.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.053 ' 00:07:37.053 07:32:47 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:37.053 07:32:47 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:37.053 07:32:47 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:37.053 07:32:47 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:37.053 07:32:47 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:37.053 07:32:47 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:37.053 07:32:47 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:37.053 07:32:47 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:37.053 07:32:47 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:37.053 07:32:47 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:37.053 07:32:47 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:37.053 07:32:47 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:37.053 07:32:47 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:37.053 07:32:47 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:37.053 07:32:47 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:37.053 07:32:47 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:37.053 07:32:47 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:37.053 07:32:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:37.053 07:32:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:37.053 07:32:47 -- common/autotest_common.sh@10 -- # set +x 00:07:37.053 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:07:37.313 ************************************ 00:07:37.313 START TEST nvmf_fuzz 00:07:37.313 ************************************ 00:07:37.313 07:32:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:37.313 * Looking for test storage... 
00:07:37.313 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.313 07:32:47 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:37.313 07:32:47 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:37.313 07:32:47 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:37.313 07:32:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:37.313 07:32:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:37.313 07:32:47 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:37.313 07:32:47 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:37.313 07:32:47 -- scripts/common.sh@335 -- # IFS=.-: 00:07:37.313 07:32:47 -- scripts/common.sh@335 -- # read -ra ver1 00:07:37.313 07:32:47 -- scripts/common.sh@336 -- # IFS=.-: 00:07:37.313 07:32:47 -- scripts/common.sh@336 -- # read -ra ver2 00:07:37.313 07:32:47 -- scripts/common.sh@337 -- # local 'op=<' 00:07:37.313 07:32:47 -- scripts/common.sh@339 -- # ver1_l=2 00:07:37.313 07:32:47 -- scripts/common.sh@340 -- # ver2_l=1 00:07:37.313 07:32:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:37.313 07:32:47 -- scripts/common.sh@343 -- # case "$op" in 00:07:37.313 07:32:47 -- scripts/common.sh@344 -- # : 1 00:07:37.313 07:32:47 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:37.314 07:32:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:37.314 07:32:47 -- scripts/common.sh@364 -- # decimal 1 00:07:37.314 07:32:48 -- scripts/common.sh@352 -- # local d=1 00:07:37.314 07:32:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:37.314 07:32:48 -- scripts/common.sh@354 -- # echo 1 00:07:37.314 07:32:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:37.314 07:32:48 -- scripts/common.sh@365 -- # decimal 2 00:07:37.314 07:32:48 -- scripts/common.sh@352 -- # local d=2 00:07:37.314 07:32:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:37.314 07:32:48 -- scripts/common.sh@354 -- # echo 2 00:07:37.314 07:32:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:37.314 07:32:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:37.314 07:32:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:37.314 07:32:48 -- scripts/common.sh@367 -- # return 0 00:07:37.314 07:32:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:37.314 07:32:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:37.314 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.314 --rc genhtml_branch_coverage=1 00:07:37.314 --rc genhtml_function_coverage=1 00:07:37.314 --rc genhtml_legend=1 00:07:37.314 --rc geninfo_all_blocks=1 00:07:37.314 --rc geninfo_unexecuted_blocks=1 00:07:37.314 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.314 ' 00:07:37.314 07:32:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:37.314 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.314 --rc genhtml_branch_coverage=1 00:07:37.314 --rc genhtml_function_coverage=1 00:07:37.314 --rc genhtml_legend=1 00:07:37.314 --rc geninfo_all_blocks=1 00:07:37.314 --rc geninfo_unexecuted_blocks=1 00:07:37.314 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.314 ' 00:07:37.314 07:32:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:37.314 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.314 --rc genhtml_branch_coverage=1 
00:07:37.314 --rc genhtml_function_coverage=1 00:07:37.314 --rc genhtml_legend=1 00:07:37.314 --rc geninfo_all_blocks=1 00:07:37.314 --rc geninfo_unexecuted_blocks=1 00:07:37.314 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.314 ' 00:07:37.314 07:32:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:37.314 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.314 --rc genhtml_branch_coverage=1 00:07:37.314 --rc genhtml_function_coverage=1 00:07:37.314 --rc genhtml_legend=1 00:07:37.314 --rc geninfo_all_blocks=1 00:07:37.314 --rc geninfo_unexecuted_blocks=1 00:07:37.314 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.314 ' 00:07:37.314 07:32:48 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:37.314 07:32:48 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:37.314 07:32:48 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:37.314 07:32:48 -- common/autotest_common.sh@34 -- # set -e 00:07:37.314 07:32:48 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:37.314 07:32:48 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:37.314 07:32:48 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:37.314 07:32:48 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:37.314 07:32:48 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:37.314 07:32:48 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:37.314 07:32:48 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:37.314 07:32:48 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:37.314 07:32:48 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:37.314 07:32:48 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:37.314 07:32:48 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:37.314 07:32:48 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:37.314 07:32:48 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:37.314 07:32:48 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:37.314 07:32:48 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:37.314 07:32:48 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:37.314 07:32:48 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:37.314 07:32:48 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:37.314 07:32:48 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:37.314 07:32:48 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:37.314 07:32:48 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:37.314 07:32:48 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:37.314 07:32:48 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:37.314 07:32:48 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:37.314 07:32:48 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:37.314 07:32:48 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:37.314 07:32:48 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:37.314 07:32:48 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:37.314 07:32:48 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:37.314 
07:32:48 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:37.314 07:32:48 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:37.314 07:32:48 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:37.314 07:32:48 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:37.314 07:32:48 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:37.314 07:32:48 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:37.314 07:32:48 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:37.314 07:32:48 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:37.314 07:32:48 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:37.314 07:32:48 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:37.314 07:32:48 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:37.314 07:32:48 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:37.314 07:32:48 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:37.314 07:32:48 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:37.314 07:32:48 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:37.314 07:32:48 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:37.314 07:32:48 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:37.314 07:32:48 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:37.314 07:32:48 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:37.314 07:32:48 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:37.314 07:32:48 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:37.314 07:32:48 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:37.314 07:32:48 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:37.314 07:32:48 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:37.314 07:32:48 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:37.314 07:32:48 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:37.314 07:32:48 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:37.314 07:32:48 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:37.314 07:32:48 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:37.314 07:32:48 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:37.314 07:32:48 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:37.314 07:32:48 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:37.314 07:32:48 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:37.314 07:32:48 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:37.314 07:32:48 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:37.314 07:32:48 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.314 07:32:48 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:37.314 07:32:48 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:37.314 07:32:48 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:37.314 07:32:48 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:37.314 07:32:48 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:37.314 07:32:48 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:37.314 07:32:48 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:37.314 07:32:48 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:37.314 07:32:48 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:37.314 07:32:48 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:37.314 07:32:48 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:37.314 07:32:48 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:37.314 07:32:48 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:37.314 07:32:48 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:37.314 07:32:48 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:37.314 07:32:48 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:37.314 07:32:48 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:37.314 07:32:48 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:37.314 07:32:48 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:37.314 07:32:48 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:37.314 07:32:48 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:37.314 07:32:48 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:37.314 07:32:48 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:37.314 07:32:48 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:37.314 07:32:48 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:37.314 07:32:48 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:37.314 07:32:48 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:37.314 07:32:48 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:37.314 07:32:48 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:37.315 07:32:48 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:37.315 07:32:48 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:37.315 07:32:48 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:37.315 07:32:48 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:37.315 07:32:48 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:37.315 #define SPDK_CONFIG_H 00:07:37.315 #define SPDK_CONFIG_APPS 1 00:07:37.315 #define SPDK_CONFIG_ARCH native 00:07:37.315 #undef SPDK_CONFIG_ASAN 00:07:37.315 #undef SPDK_CONFIG_AVAHI 00:07:37.315 #undef SPDK_CONFIG_CET 00:07:37.315 #define SPDK_CONFIG_COVERAGE 1 00:07:37.315 #define SPDK_CONFIG_CROSS_PREFIX 00:07:37.315 #undef SPDK_CONFIG_CRYPTO 00:07:37.315 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:37.315 #undef SPDK_CONFIG_CUSTOMOCF 00:07:37.315 #undef SPDK_CONFIG_DAOS 00:07:37.315 #define SPDK_CONFIG_DAOS_DIR 00:07:37.315 #define SPDK_CONFIG_DEBUG 1 00:07:37.315 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:37.315 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:37.315 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:37.315 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.315 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:37.315 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:37.315 #define SPDK_CONFIG_EXAMPLES 1 00:07:37.315 #undef SPDK_CONFIG_FC 00:07:37.315 #define SPDK_CONFIG_FC_PATH 00:07:37.315 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:37.315 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:37.315 #undef SPDK_CONFIG_FUSE 00:07:37.315 #define SPDK_CONFIG_FUZZER 1 00:07:37.315 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:37.315 #undef SPDK_CONFIG_GOLANG 00:07:37.315 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:37.315 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:37.315 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:37.315 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:37.315 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:37.315 #define SPDK_CONFIG_IDXD 1 00:07:37.315 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:37.315 #undef SPDK_CONFIG_IPSEC_MB 00:07:37.315 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:37.315 #define SPDK_CONFIG_ISAL 1 00:07:37.315 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:37.315 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:37.315 #define SPDK_CONFIG_LIBDIR 00:07:37.315 #undef SPDK_CONFIG_LTO 00:07:37.315 #define SPDK_CONFIG_MAX_LCORES 00:07:37.315 #define SPDK_CONFIG_NVME_CUSE 1 00:07:37.315 #undef SPDK_CONFIG_OCF 00:07:37.315 #define SPDK_CONFIG_OCF_PATH 00:07:37.315 #define SPDK_CONFIG_OPENSSL_PATH 00:07:37.315 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:37.315 #undef SPDK_CONFIG_PGO_USE 00:07:37.315 #define SPDK_CONFIG_PREFIX /usr/local 00:07:37.315 #undef SPDK_CONFIG_RAID5F 00:07:37.315 #undef SPDK_CONFIG_RBD 00:07:37.315 #define SPDK_CONFIG_RDMA 1 00:07:37.315 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:37.315 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:37.315 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:37.315 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:37.315 #undef SPDK_CONFIG_SHARED 00:07:37.315 #undef SPDK_CONFIG_SMA 00:07:37.315 #define SPDK_CONFIG_TESTS 1 00:07:37.315 #undef SPDK_CONFIG_TSAN 00:07:37.315 #define SPDK_CONFIG_UBLK 1 00:07:37.315 #define SPDK_CONFIG_UBSAN 1 00:07:37.315 #undef SPDK_CONFIG_UNIT_TESTS 00:07:37.315 #undef SPDK_CONFIG_URING 00:07:37.315 #define SPDK_CONFIG_URING_PATH 00:07:37.315 #undef SPDK_CONFIG_URING_ZNS 00:07:37.315 #undef SPDK_CONFIG_USDT 00:07:37.315 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:37.315 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:37.315 #define SPDK_CONFIG_VFIO_USER 1 00:07:37.315 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:37.315 #define SPDK_CONFIG_VHOST 1 00:07:37.315 #define SPDK_CONFIG_VIRTIO 1 00:07:37.315 #undef SPDK_CONFIG_VTUNE 00:07:37.315 #define SPDK_CONFIG_VTUNE_DIR 00:07:37.315 #define SPDK_CONFIG_WERROR 1 00:07:37.315 #define SPDK_CONFIG_WPDK_DIR 00:07:37.315 #undef SPDK_CONFIG_XNVME 00:07:37.315 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:37.315 07:32:48 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:37.315 07:32:48 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:37.315 07:32:48 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:37.315 07:32:48 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:37.315 07:32:48 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:37.315 07:32:48 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:37.315 07:32:48 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:37.315 07:32:48 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:37.315 07:32:48 -- paths/export.sh@5 -- # export PATH 00:07:37.315 07:32:48 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:37.315 07:32:48 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:37.315 07:32:48 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:37.315 07:32:48 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:37.315 07:32:48 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:37.315 07:32:48 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:37.315 07:32:48 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:37.315 07:32:48 -- pm/common@16 -- # TEST_TAG=N/A 00:07:37.315 07:32:48 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:37.315 07:32:48 -- common/autotest_common.sh@52 -- # : 1 00:07:37.315 07:32:48 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:37.315 07:32:48 -- common/autotest_common.sh@56 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:37.315 07:32:48 -- common/autotest_common.sh@58 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:37.315 07:32:48 -- common/autotest_common.sh@60 -- # : 1 00:07:37.315 07:32:48 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:37.315 07:32:48 -- common/autotest_common.sh@62 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:37.315 07:32:48 -- common/autotest_common.sh@64 -- # : 00:07:37.315 07:32:48 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:37.315 07:32:48 -- common/autotest_common.sh@66 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:37.315 07:32:48 -- common/autotest_common.sh@68 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:37.315 07:32:48 -- common/autotest_common.sh@70 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:37.315 07:32:48 -- common/autotest_common.sh@72 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:37.315 07:32:48 -- common/autotest_common.sh@74 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:37.315 07:32:48 -- common/autotest_common.sh@76 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:37.315 07:32:48 -- common/autotest_common.sh@78 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:37.315 07:32:48 -- common/autotest_common.sh@80 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:37.315 07:32:48 -- common/autotest_common.sh@82 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:37.315 07:32:48 -- common/autotest_common.sh@84 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:37.315 07:32:48 -- common/autotest_common.sh@86 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:37.315 07:32:48 -- common/autotest_common.sh@88 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:37.315 07:32:48 -- common/autotest_common.sh@90 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:37.315 07:32:48 -- common/autotest_common.sh@92 -- # : 1 00:07:37.315 07:32:48 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:37.315 07:32:48 -- common/autotest_common.sh@94 -- # : 1 00:07:37.315 07:32:48 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:37.315 07:32:48 -- common/autotest_common.sh@96 -- # : rdma 00:07:37.315 07:32:48 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:37.315 07:32:48 -- common/autotest_common.sh@98 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:37.315 07:32:48 -- common/autotest_common.sh@100 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:37.315 07:32:48 -- common/autotest_common.sh@102 -- # : 0 00:07:37.315 07:32:48 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:37.577 07:32:48 -- common/autotest_common.sh@104 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:37.577 07:32:48 -- common/autotest_common.sh@106 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:37.577 07:32:48 -- common/autotest_common.sh@108 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:37.577 07:32:48 -- common/autotest_common.sh@110 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:37.577 07:32:48 -- common/autotest_common.sh@112 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:37.577 07:32:48 -- common/autotest_common.sh@114 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:37.577 07:32:48 -- common/autotest_common.sh@116 -- # : 1 00:07:37.577 07:32:48 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:37.577 07:32:48 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:37.577 07:32:48 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:37.577 07:32:48 -- common/autotest_common.sh@120 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:37.577 07:32:48 -- common/autotest_common.sh@122 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:37.577 07:32:48 -- common/autotest_common.sh@124 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:37.577 07:32:48 -- common/autotest_common.sh@126 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:37.577 07:32:48 -- common/autotest_common.sh@128 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:37.577 07:32:48 -- common/autotest_common.sh@130 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:37.577 07:32:48 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:37.577 07:32:48 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:37.577 07:32:48 -- common/autotest_common.sh@134 -- # : true 00:07:37.577 07:32:48 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:37.577 07:32:48 -- common/autotest_common.sh@136 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:37.577 07:32:48 -- common/autotest_common.sh@138 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:37.577 07:32:48 -- common/autotest_common.sh@140 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:37.577 07:32:48 -- common/autotest_common.sh@142 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:37.577 07:32:48 -- common/autotest_common.sh@144 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:37.577 07:32:48 -- common/autotest_common.sh@146 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:37.577 07:32:48 -- common/autotest_common.sh@148 -- # : 00:07:37.577 07:32:48 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:37.577 07:32:48 -- common/autotest_common.sh@150 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:37.577 07:32:48 -- common/autotest_common.sh@152 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:37.577 07:32:48 -- common/autotest_common.sh@154 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:37.577 07:32:48 -- 
common/autotest_common.sh@156 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:37.577 07:32:48 -- common/autotest_common.sh@158 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:37.577 07:32:48 -- common/autotest_common.sh@160 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:37.577 07:32:48 -- common/autotest_common.sh@163 -- # : 00:07:37.577 07:32:48 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:37.577 07:32:48 -- common/autotest_common.sh@165 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:37.577 07:32:48 -- common/autotest_common.sh@167 -- # : 0 00:07:37.577 07:32:48 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:37.577 07:32:48 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.577 07:32:48 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:37.577 07:32:48 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:37.578 07:32:48 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:37.578 07:32:48 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:37.578 07:32:48 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:37.578 07:32:48 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:37.578 07:32:48 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:37.578 07:32:48 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:37.578 07:32:48 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:37.578 07:32:48 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:37.578 07:32:48 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:37.578 07:32:48 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:37.578 07:32:48 -- common/autotest_common.sh@196 -- # cat 00:07:37.578 07:32:48 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:37.578 07:32:48 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:37.578 07:32:48 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:37.578 07:32:48 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:37.578 07:32:48 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:37.578 07:32:48 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:37.578 07:32:48 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:37.578 07:32:48 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:37.578 07:32:48 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:37.578 07:32:48 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:37.578 07:32:48 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:37.578 07:32:48 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:37.578 07:32:48 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:37.578 07:32:48 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:37.578 07:32:48 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:37.578 07:32:48 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:37.578 07:32:48 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:37.578 07:32:48 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:37.578 07:32:48 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:37.578 07:32:48 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:37.578 07:32:48 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:37.578 07:32:48 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:37.578 07:32:48 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:37.578 07:32:48 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:37.578 07:32:48 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:37.578 07:32:48 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:37.578 07:32:48 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:37.578 07:32:48 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:37.578 07:32:48 -- common/autotest_common.sh@259 -- # valgrind= 00:07:37.578 07:32:48 -- common/autotest_common.sh@265 -- # uname -s 00:07:37.578 07:32:48 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:37.578 07:32:48 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:37.578 07:32:48 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:37.578 07:32:48 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:37.578 07:32:48 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:37.578 07:32:48 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:37.578 07:32:48 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:37.578 07:32:48 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:37.578 07:32:48 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:37.578 07:32:48 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:37.578 07:32:48 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:37.578 07:32:48 -- common/autotest_common.sh@319 -- # [[ -z 1660200 ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@319 -- # kill -0 1660200 00:07:37.578 07:32:48 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:37.578 07:32:48 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:37.578 07:32:48 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:37.578 07:32:48 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:37.578 07:32:48 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:37.578 07:32:48 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:37.578 07:32:48 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:37.578 07:32:48 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.Skm5J1 00:07:37.578 07:32:48 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:37.578 07:32:48 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:37.578 07:32:48 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.Skm5J1/tests/nvmf /tmp/spdk.Skm5J1 00:07:37.578 07:32:48 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@328 -- # df -T 00:07:37.578 07:32:48 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:37.578 07:32:48 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=51913093120 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=9817513984 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863560704 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=1744896 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:37.578 07:32:48 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:37.578 07:32:48 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:37.578 07:32:48 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:37.578 07:32:48 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:37.578 * Looking for test storage... 
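
The set_test_storage trace above boils down to a simple probe: enumerate candidate directories, read the free space of each one's backing filesystem out of df, and export the first candidate that can hold the requested ~2 GiB. A minimal bash sketch of that pattern follows (the variable names and the df -P parsing are illustrative only; the real autotest_common.sh uses df -T and the per-mount associative arrays visible in the trace, and pads the request with a small margin):

    #!/usr/bin/env bash
    # Sketch of the test-storage probe: walk candidate directories and pick
    # the first one whose backing filesystem has enough free space.
    # Paraphrased from the set_test_storage flow logged above; helper names
    # here are placeholders, not the actual autotest_common.sh internals.
    requested_size=$((2 * 1024 * 1024 * 1024))          # 2 GiB, as requested in the trace
    candidates=("$PWD" "$(mktemp -udt spdk.XXXXXX)")    # test dir, then a /tmp fallback

    for dir in "${candidates[@]}"; do
        mkdir -p "$dir" 2>/dev/null || continue
        # df -P prints one data row: fs, size, used, avail (1K blocks), use%, mount
        read -r _ _ _ avail _ mount < <(df -P "$dir" | tail -n 1)
        target_space=$((avail * 1024))                  # bytes available on that mount
        if ((target_space >= requested_size)); then
            export SPDK_TEST_STORAGE=$dir
            printf '* Found test storage at %s (%d bytes free on %s)\n' \
                "$dir" "$target_space" "$mount"
            break
        fi
    done
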
00:07:37.579 07:32:48 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:37.579 07:32:48 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:37.579 07:32:48 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.579 07:32:48 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:37.579 07:32:48 -- common/autotest_common.sh@373 -- # mount=/ 00:07:37.579 07:32:48 -- common/autotest_common.sh@375 -- # target_space=51913093120 00:07:37.579 07:32:48 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:37.579 07:32:48 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:37.579 07:32:48 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@382 -- # new_size=12032106496 00:07:37.579 07:32:48 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:37.579 07:32:48 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.579 07:32:48 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.579 07:32:48 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.579 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.579 07:32:48 -- common/autotest_common.sh@390 -- # return 0 00:07:37.579 07:32:48 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:37.579 07:32:48 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:37.579 07:32:48 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:37.579 07:32:48 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1682 -- # true 00:07:37.579 07:32:48 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:37.579 07:32:48 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@27 -- # exec 00:07:37.579 07:32:48 -- common/autotest_common.sh@29 -- # exec 00:07:37.579 07:32:48 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:37.579 07:32:48 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:37.579 07:32:48 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:37.579 07:32:48 -- common/autotest_common.sh@18 -- # set -x 00:07:37.579 07:32:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:37.579 07:32:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:37.579 07:32:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:37.579 07:32:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:37.579 07:32:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:37.579 07:32:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:37.579 07:32:48 -- scripts/common.sh@335 -- # IFS=.-: 00:07:37.579 07:32:48 -- scripts/common.sh@335 -- # read -ra ver1 00:07:37.579 07:32:48 -- scripts/common.sh@336 -- # IFS=.-: 00:07:37.579 07:32:48 -- scripts/common.sh@336 -- # read -ra ver2 00:07:37.579 07:32:48 -- scripts/common.sh@337 -- # local 'op=<' 00:07:37.579 07:32:48 -- scripts/common.sh@339 -- # ver1_l=2 00:07:37.579 07:32:48 -- scripts/common.sh@340 -- # ver2_l=1 00:07:37.579 07:32:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:37.579 07:32:48 -- scripts/common.sh@343 -- # case "$op" in 00:07:37.579 07:32:48 -- scripts/common.sh@344 -- # : 1 00:07:37.579 07:32:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:37.579 07:32:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:37.579 07:32:48 -- scripts/common.sh@364 -- # decimal 1 00:07:37.579 07:32:48 -- scripts/common.sh@352 -- # local d=1 00:07:37.579 07:32:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:37.579 07:32:48 -- scripts/common.sh@354 -- # echo 1 00:07:37.579 07:32:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:37.579 07:32:48 -- scripts/common.sh@365 -- # decimal 2 00:07:37.579 07:32:48 -- scripts/common.sh@352 -- # local d=2 00:07:37.579 07:32:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:37.579 07:32:48 -- scripts/common.sh@354 -- # echo 2 00:07:37.579 07:32:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:37.579 07:32:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:37.579 07:32:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:37.579 07:32:48 -- scripts/common.sh@367 -- # return 0 00:07:37.579 07:32:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:37.579 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.579 --rc genhtml_branch_coverage=1 00:07:37.579 --rc genhtml_function_coverage=1 00:07:37.579 --rc genhtml_legend=1 00:07:37.579 --rc geninfo_all_blocks=1 00:07:37.579 --rc geninfo_unexecuted_blocks=1 00:07:37.579 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.579 ' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:37.579 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.579 --rc genhtml_branch_coverage=1 00:07:37.579 --rc genhtml_function_coverage=1 00:07:37.579 --rc genhtml_legend=1 00:07:37.579 --rc geninfo_all_blocks=1 00:07:37.579 --rc geninfo_unexecuted_blocks=1 00:07:37.579 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.579 ' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:37.579 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:37.579 --rc genhtml_branch_coverage=1 00:07:37.579 --rc genhtml_function_coverage=1 00:07:37.579 --rc genhtml_legend=1 00:07:37.579 --rc geninfo_all_blocks=1 00:07:37.579 --rc geninfo_unexecuted_blocks=1 00:07:37.579 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.579 ' 00:07:37.579 07:32:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:37.579 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.579 --rc genhtml_branch_coverage=1 00:07:37.579 --rc genhtml_function_coverage=1 00:07:37.579 --rc genhtml_legend=1 00:07:37.579 --rc geninfo_all_blocks=1 00:07:37.579 --rc geninfo_unexecuted_blocks=1 00:07:37.579 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.579 ' 00:07:37.579 07:32:48 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:37.579 07:32:48 -- ../common.sh@8 -- # pids=() 00:07:37.579 07:32:48 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:37.579 07:32:48 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:37.579 07:32:48 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:37.579 07:32:48 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:37.579 07:32:48 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:37.579 07:32:48 -- nvmf/run.sh@61 -- # mem_size=512 00:07:37.579 07:32:48 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:37.579 07:32:48 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:37.579 07:32:48 -- ../common.sh@69 -- # local fuzz_num=25 00:07:37.579 07:32:48 -- ../common.sh@70 -- # local time=1 00:07:37.579 07:32:48 -- ../common.sh@72 -- # (( i = 0 )) 00:07:37.579 07:32:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.579 07:32:48 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:37.579 07:32:48 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:37.579 07:32:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.579 07:32:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.579 07:32:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.579 07:32:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:37.579 07:32:48 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:37.579 07:32:48 -- nvmf/run.sh@29 -- # port=4400 00:07:37.579 07:32:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.579 07:32:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:37.579 07:32:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.579 07:32:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:37.579 [2024-11-28 07:32:48.303782] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:37.579 [2024-11-28 07:32:48.303845] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660260 ] 00:07:37.579 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.839 [2024-11-28 07:32:48.478293] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.839 [2024-11-28 07:32:48.497683] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.839 [2024-11-28 07:32:48.497803] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.839 [2024-11-28 07:32:48.549030] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.839 [2024-11-28 07:32:48.565388] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:37.839 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.839 INFO: Seed: 4040557287 00:07:37.839 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:37.839 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:37.839 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.839 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.839 #2 INITED exec/s: 0 rss: 59Mb 00:07:37.839 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.839 This may also happen if the target rejected all inputs we tried so far 00:07:38.098 [2024-11-28 07:32:48.610471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.098 [2024-11-28 07:32:48.610501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 NEW_FUNC[1/671]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:38.357 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.357 #7 NEW cov: 11562 ft: 11563 corp: 2/82b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 5 ChangeByte-InsertByte-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:38.357 [2024-11-28 07:32:48.911189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.357 [2024-11-28 07:32:48.911221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 #8 NEW cov: 11675 ft: 12035 corp: 3/185b lim: 320 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 InsertRepeatedBytes- 00:07:38.357 [2024-11-28 07:32:48.951256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.357 [2024-11-28 07:32:48.951283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 #9 NEW cov: 11681 ft: 12221 corp: 4/288b lim: 320 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 
ChangeByte- 00:07:38.357 [2024-11-28 07:32:48.991485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.357 [2024-11-28 07:32:48.991514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 [2024-11-28 07:32:48.991569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.357 [2024-11-28 07:32:48.991584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.357 #10 NEW cov: 11766 ft: 12718 corp: 5/433b lim: 320 exec/s: 0 rss: 67Mb L: 145/145 MS: 1 CrossOver- 00:07:38.357 [2024-11-28 07:32:49.031464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.357 [2024-11-28 07:32:49.031490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 #11 NEW cov: 11766 ft: 12852 corp: 6/536b lim: 320 exec/s: 0 rss: 67Mb L: 103/145 MS: 1 ChangeByte- 00:07:38.357 [2024-11-28 07:32:49.071554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.357 [2024-11-28 07:32:49.071584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 #12 NEW cov: 11766 ft: 12943 corp: 7/640b lim: 320 exec/s: 0 rss: 67Mb L: 104/145 MS: 1 InsertByte- 00:07:38.357 [2024-11-28 07:32:49.101648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.357 [2024-11-28 07:32:49.101675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.357 #18 NEW cov: 11766 ft: 13000 corp: 8/743b lim: 320 exec/s: 0 rss: 67Mb L: 103/145 MS: 1 ChangeBit- 00:07:38.616 [2024-11-28 07:32:49.131867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:cececece cdw11:cececece SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.616 [2024-11-28 07:32:49.131894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 [2024-11-28 07:32:49.131951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ce) qid:0 cid:5 nsid:cececece cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffce 00:07:38.616 [2024-11-28 07:32:49.131965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.616 #19 NEW cov: 11785 ft: 13113 corp: 9/900b lim: 320 exec/s: 0 rss: 67Mb L: 157/157 MS: 1 InsertRepeatedBytes- 00:07:38.616 [2024-11-28 07:32:49.171844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0xffffffffffffffff 00:07:38.616 [2024-11-28 07:32:49.171870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 #20 NEW cov: 11785 ft: 13244 corp: 10/1003b lim: 320 exec/s: 0 rss: 67Mb L: 103/157 MS: 1 ChangeBinInt- 00:07:38.616 [2024-11-28 07:32:49.201932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.616 [2024-11-28 07:32:49.201958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 #21 NEW cov: 11785 ft: 13312 corp: 11/1126b lim: 320 exec/s: 0 rss: 67Mb L: 123/157 MS: 1 InsertRepeatedBytes- 00:07:38.616 [2024-11-28 07:32:49.242187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:cececece cdw11:cececece SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.616 [2024-11-28 07:32:49.242213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 [2024-11-28 07:32:49.242271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ce) qid:0 cid:5 nsid:cececece cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffce 00:07:38.616 [2024-11-28 07:32:49.242286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.616 #22 NEW cov: 11785 ft: 13377 corp: 12/1283b lim: 320 exec/s: 0 rss: 67Mb L: 157/157 MS: 1 CrossOver- 00:07:38.616 [2024-11-28 07:32:49.282177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.616 [2024-11-28 07:32:49.282202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 #23 NEW cov: 11785 ft: 13392 corp: 13/1386b lim: 320 exec/s: 0 rss: 68Mb L: 103/157 MS: 1 ChangeByte- 00:07:38.616 [2024-11-28 07:32:49.322249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ff32ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.616 [2024-11-28 07:32:49.322280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.616 #24 NEW cov: 11785 ft: 13405 corp: 14/1489b lim: 320 exec/s: 0 rss: 68Mb L: 103/157 MS: 1 ChangeByte- 00:07:38.616 [2024-11-28 07:32:49.362380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.617 [2024-11-28 07:32:49.362407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.617 #25 NEW cov: 11785 ft: 13436 corp: 15/1593b lim: 320 exec/s: 0 rss: 68Mb L: 104/157 MS: 1 InsertByte- 00:07:38.875 [2024-11-28 07:32:49.392458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffff3affff 00:07:38.876 [2024-11-28 07:32:49.392485] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #26 NEW cov: 11785 ft: 13521 corp: 16/1696b lim: 320 exec/s: 0 rss: 68Mb L: 103/157 MS: 1 ChangeByte- 00:07:38.876 [2024-11-28 07:32:49.432609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffff3affff 00:07:38.876 [2024-11-28 07:32:49.432636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #27 NEW cov: 11785 ft: 13595 corp: 17/1799b lim: 320 exec/s: 0 rss: 68Mb L: 103/157 MS: 1 ChangeBit- 00:07:38.876 [2024-11-28 07:32:49.472728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffdffffff 00:07:38.876 [2024-11-28 07:32:49.472754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #28 NEW cov: 11785 ft: 13606 corp: 18/1902b lim: 320 exec/s: 0 rss: 68Mb L: 103/157 MS: 1 ChangeBit- 00:07:38.876 [2024-11-28 07:32:49.502924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.876 [2024-11-28 07:32:49.502950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 [2024-11-28 07:32:49.503002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:38.876 [2024-11-28 07:32:49.503016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.876 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.876 #29 NEW cov: 11810 ft: 13647 corp: 19/2068b lim: 320 exec/s: 0 rss: 68Mb L: 166/166 MS: 1 InsertRepeatedBytes- 00:07:38.876 [2024-11-28 07:32:49.542927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ff31ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.876 [2024-11-28 07:32:49.542953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #30 NEW cov: 11810 ft: 13670 corp: 20/2171b lim: 320 exec/s: 0 rss: 68Mb L: 103/166 MS: 1 ChangeASCIIInt- 00:07:38.876 [2024-11-28 07:32:49.583069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.876 [2024-11-28 07:32:49.583095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #31 NEW cov: 11810 ft: 13701 corp: 21/2275b lim: 320 exec/s: 31 rss: 68Mb L: 104/166 MS: 1 ShuffleBytes- 00:07:38.876 [2024-11-28 07:32:49.623194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:38.876 [2024-11-28 07:32:49.623223] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.876 #32 NEW cov: 11810 ft: 13792 corp: 22/2378b lim: 320 exec/s: 32 rss: 68Mb L: 103/166 MS: 1 ShuffleBytes- 00:07:39.134 [2024-11-28 07:32:49.653290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.653316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #33 NEW cov: 11810 ft: 13816 corp: 23/2482b lim: 320 exec/s: 33 rss: 68Mb L: 104/166 MS: 1 ShuffleBytes- 00:07:39.134 [2024-11-28 07:32:49.693401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:24ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.693427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #34 NEW cov: 11810 ft: 13829 corp: 24/2585b lim: 320 exec/s: 34 rss: 68Mb L: 103/166 MS: 1 ChangeByte- 00:07:39.134 [2024-11-28 07:32:49.723469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.723495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #35 NEW cov: 11810 ft: 13846 corp: 25/2689b lim: 320 exec/s: 35 rss: 68Mb L: 104/166 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.134 [2024-11-28 07:32:49.763614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.763641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #36 NEW cov: 11810 ft: 13867 corp: 26/2793b lim: 320 exec/s: 36 rss: 68Mb L: 104/166 MS: 1 ShuffleBytes- 00:07:39.134 [2024-11-28 07:32:49.803705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:31ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.803731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #37 NEW cov: 11810 ft: 13894 corp: 27/2897b lim: 320 exec/s: 37 rss: 68Mb L: 104/166 MS: 1 InsertByte- 00:07:39.134 [2024-11-28 07:32:49.843860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ff31ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 07:32:49.843887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #38 NEW cov: 11810 ft: 13910 corp: 28/2987b lim: 320 exec/s: 38 rss: 68Mb L: 90/166 MS: 1 CrossOver- 00:07:39.134 [2024-11-28 07:32:49.883936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.134 [2024-11-28 
07:32:49.883963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.134 #39 NEW cov: 11810 ft: 13916 corp: 29/3106b lim: 320 exec/s: 39 rss: 68Mb L: 119/166 MS: 1 CopyPart- 00:07:39.394 [2024-11-28 07:32:49.924139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.394 [2024-11-28 07:32:49.924166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 [2024-11-28 07:32:49.924230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.394 [2024-11-28 07:32:49.924244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.394 #40 NEW cov: 11810 ft: 13926 corp: 30/3252b lim: 320 exec/s: 40 rss: 69Mb L: 146/166 MS: 1 InsertByte- 00:07:39.394 [2024-11-28 07:32:49.964323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:cececece cdw11:0000cece SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:39.394 [2024-11-28 07:32:49.964350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 [2024-11-28 07:32:49.964413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ce) qid:0 cid:5 nsid:cececece cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xcececececececece 00:07:39.394 [2024-11-28 07:32:49.964427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.394 #41 NEW cov: 11810 ft: 13972 corp: 31/3417b lim: 320 exec/s: 41 rss: 69Mb L: 165/166 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.394 [2024-11-28 07:32:50.004490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ff31ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.394 [2024-11-28 07:32:50.004518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 #42 NEW cov: 11810 ft: 14038 corp: 32/3520b lim: 320 exec/s: 42 rss: 69Mb L: 103/166 MS: 1 ChangeBit- 00:07:39.394 [2024-11-28 07:32:50.044431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffdffffff 00:07:39.394 [2024-11-28 07:32:50.044458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 #43 NEW cov: 11810 ft: 14048 corp: 33/3623b lim: 320 exec/s: 43 rss: 69Mb L: 103/166 MS: 1 ChangeByte- 00:07:39.394 [2024-11-28 07:32:50.084548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff00000009 00:07:39.394 [2024-11-28 07:32:50.084575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 #48 NEW cov: 11810 
ft: 14060 corp: 34/3695b lim: 320 exec/s: 48 rss: 69Mb L: 72/166 MS: 5 EraseBytes-ChangeASCIIInt-ChangeBinInt-CopyPart-CopyPart- 00:07:39.394 [2024-11-28 07:32:50.124616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ff31ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.394 [2024-11-28 07:32:50.124642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.394 #49 NEW cov: 11810 ft: 14121 corp: 35/3806b lim: 320 exec/s: 49 rss: 69Mb L: 111/166 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.653 [2024-11-28 07:32:50.164941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.653 [2024-11-28 07:32:50.164968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.653 [2024-11-28 07:32:50.165030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.653 [2024-11-28 07:32:50.165044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.653 [2024-11-28 07:32:50.165108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffcdffffff3a 00:07:39.653 [2024-11-28 07:32:50.165123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.653 #50 NEW cov: 11810 ft: 14339 corp: 36/4035b lim: 320 exec/s: 50 rss: 69Mb L: 229/229 MS: 1 InsertRepeatedBytes- 00:07:39.653 [2024-11-28 07:32:50.204867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.653 [2024-11-28 07:32:50.204893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.653 #51 NEW cov: 11810 ft: 14364 corp: 37/4139b lim: 320 exec/s: 51 rss: 69Mb L: 104/229 MS: 1 ChangeByte- 00:07:39.653 [2024-11-28 07:32:50.245020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.653 [2024-11-28 07:32:50.245047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.653 #56 NEW cov: 11810 ft: 14389 corp: 38/4256b lim: 320 exec/s: 56 rss: 69Mb L: 117/229 MS: 5 CrossOver-ShuffleBytes-EraseBytes-CrossOver-CrossOver- 00:07:39.653 [2024-11-28 07:32:50.285096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xd7637068b0ffffff 00:07:39.653 [2024-11-28 07:32:50.285123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.653 #57 NEW cov: 11810 ft: 14410 corp: 39/4359b lim: 320 exec/s: 57 rss: 69Mb L: 103/229 MS: 
1 CMP- DE: "\260hpc\327\373\222\000"- 00:07:39.654 [2024-11-28 07:32:50.315165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.654 [2024-11-28 07:32:50.315192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.654 #60 NEW cov: 11810 ft: 14437 corp: 40/4424b lim: 320 exec/s: 60 rss: 69Mb L: 65/229 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:07:39.654 [2024-11-28 07:32:50.355349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:000000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.654 [2024-11-28 07:32:50.355375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.654 #61 NEW cov: 11810 ft: 14449 corp: 41/4502b lim: 320 exec/s: 61 rss: 69Mb L: 78/229 MS: 1 EraseBytes- 00:07:39.654 [2024-11-28 07:32:50.395435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:31ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.654 [2024-11-28 07:32:50.395462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.654 #62 NEW cov: 11810 ft: 14471 corp: 42/4606b lim: 320 exec/s: 62 rss: 69Mb L: 104/229 MS: 1 ChangeBinInt- 00:07:39.913 [2024-11-28 07:32:50.435596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.913 [2024-11-28 07:32:50.435627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.913 #63 NEW cov: 11810 ft: 14479 corp: 43/4710b lim: 320 exec/s: 63 rss: 69Mb L: 104/229 MS: 1 ShuffleBytes- 00:07:39.913 [2024-11-28 07:32:50.475769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.913 [2024-11-28 07:32:50.475799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.913 [2024-11-28 07:32:50.475856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.913 [2024-11-28 07:32:50.475870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.913 #64 NEW cov: 11810 ft: 14482 corp: 44/4897b lim: 320 exec/s: 64 rss: 69Mb L: 187/229 MS: 1 CopyPart- 00:07:39.913 [2024-11-28 07:32:50.515831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:39.913 [2024-11-28 07:32:50.515859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.913 [2024-11-28 07:32:50.545877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 
nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:39.913 [2024-11-28 07:32:50.545904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.913 #66 NEW cov: 11810 ft: 14538 corp: 45/5000b lim: 320 exec/s: 66 rss: 69Mb L: 103/229 MS: 2 PersAutoDict-ChangeBinInt- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.913 [2024-11-28 07:32:50.576036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1f) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffcdffffffffffff 00:07:39.913 [2024-11-28 07:32:50.576063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.913 [2024-11-28 07:32:50.576119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.913 [2024-11-28 07:32:50.576133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.913 #67 NEW cov: 11810 ft: 14546 corp: 46/5170b lim: 320 exec/s: 33 rss: 69Mb L: 170/229 MS: 1 CrossOver- 00:07:39.913 #67 DONE cov: 11810 ft: 14546 corp: 46/5170b lim: 320 exec/s: 33 rss: 69Mb 00:07:39.913 ###### Recommended dictionary. ###### 00:07:39.913 "\000\000\000\000\000\000\000\000" # Uses: 3 00:07:39.913 "\260hpc\327\373\222\000" # Uses: 0 00:07:39.913 ###### End of recommended dictionary. ###### 00:07:39.913 Done 67 runs in 2 second(s) 00:07:40.172 07:32:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:40.172 07:32:50 -- ../common.sh@72 -- # (( i++ )) 00:07:40.172 07:32:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.172 07:32:50 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:40.172 07:32:50 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:40.172 07:32:50 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.172 07:32:50 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.172 07:32:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:40.172 07:32:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:40.172 07:32:50 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:40.172 07:32:50 -- nvmf/run.sh@29 -- # port=4401 00:07:40.172 07:32:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:40.172 07:32:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:40.172 07:32:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.172 07:32:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:40.172 [2024-11-28 07:32:50.759023] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
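[Editor's note, not part of the CI log] Each completed fuzz run above ends with a "Recommended dictionary" block listing the octal-escaped byte strings libFuzzer found productive (with use counts), and each new run is launched by the same nvmf/run.sh pattern visible in the trace: pick a port (4400 + fuzzer id), sed the trsvcid into a per-run JSON config, and start llvm_nvme_fuzz against that target. Below is a minimal sketch of how the run-0 dictionary entries printed above could be carried into a later run; the file name, the octal-to-hex conversion, and the idea of passing a -dict= option are the editor's assumptions -- this log does not show that step, and whether the SPDK wrapper forwards extra libFuzzer flags is not visible here.

  # Rewrite the octal-escaped entries in the AFL-style hex form that
  # libFuzzer's -dict= files expect
  # ("\260hpc\327\373\222\000" -> "\xb0hpc\xd7\xfb\x92\x00").
  cat > /tmp/llvm_nvmf_0.dict <<'EOF'
  zeros="\x00\x00\x00\x00\x00\x00\x00\x00"
  cmp_b0="\xb0hpc\xd7\xfb\x92\x00"
  EOF
  # A plain libFuzzer binary would then consume it as:
  #   ./llvm_nvme_fuzz ... -dict=/tmp/llvm_nvmf_0.dict <corpus_dir>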
00:07:40.172 [2024-11-28 07:32:50.759093] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660304 ] 00:07:40.172 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.172 [2024-11-28 07:32:50.942224] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.436 [2024-11-28 07:32:50.962130] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.436 [2024-11-28 07:32:50.962244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.436 [2024-11-28 07:32:51.013580] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.436 [2024-11-28 07:32:51.029934] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:40.436 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.436 INFO: Seed: 2210568285 00:07:40.437 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:40.437 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:40.437 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:40.437 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.437 #2 INITED exec/s: 0 rss: 59Mb 00:07:40.437 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.437 This may also happen if the target rejected all inputs we tried so far 00:07:40.437 [2024-11-28 07:32:51.095234] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.437 [2024-11-28 07:32:51.095396] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.437 [2024-11-28 07:32:51.095717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.437 [2024-11-28 07:32:51.095754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.437 [2024-11-28 07:32:51.095872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.437 [2024-11-28 07:32:51.095887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.701 NEW_FUNC[1/670]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:40.701 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.701 #21 NEW cov: 11625 ft: 11626 corp: 2/14b lim: 30 exec/s: 0 rss: 67Mb L: 13/13 MS: 4 ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:40.701 [2024-11-28 07:32:51.406561] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.701 [2024-11-28 07:32:51.406742] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.701 [2024-11-28 07:32:51.407092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:40.701 [2024-11-28 07:32:51.407136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.701 [2024-11-28 07:32:51.407272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.701 [2024-11-28 07:32:51.407295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.701 NEW_FUNC[1/1]: 0xead838 in spdk_process_is_primary /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:290 00:07:40.701 #32 NEW cov: 11739 ft: 12256 corp: 3/27b lim: 30 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 CopyPart- 00:07:40.701 [2024-11-28 07:32:51.466740] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:40.701 [2024-11-28 07:32:51.466910] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.701 [2024-11-28 07:32:51.467281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.701 [2024-11-28 07:32:51.467308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.701 [2024-11-28 07:32:51.467395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.701 [2024-11-28 07:32:51.467412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.960 #33 NEW cov: 11745 ft: 12496 corp: 4/41b lim: 30 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 InsertByte- 00:07:40.960 [2024-11-28 07:32:51.516919] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000092 00:07:40.960 [2024-11-28 07:32:51.517103] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:40.960 [2024-11-28 07:32:51.517443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.517472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.960 [2024-11-28 07:32:51.517601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:fbd80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.517619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.960 #34 NEW cov: 11830 ft: 12762 corp: 5/54b lim: 30 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 CMP- DE: "\000\222\373\330\030\000\250\016"- 00:07:40.960 [2024-11-28 07:32:51.567103] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.960 [2024-11-28 07:32:51.567265] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.960 [2024-11-28 07:32:51.567620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.567650] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.960 [2024-11-28 07:32:51.567782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.567799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.960 #35 NEW cov: 11830 ft: 12851 corp: 6/70b lim: 30 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:40.960 [2024-11-28 07:32:51.617116] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:40.960 [2024-11-28 07:32:51.617276] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (150512) > buf size (4096) 00:07:40.960 [2024-11-28 07:32:51.617640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.617669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.960 [2024-11-28 07:32:51.617790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:92fb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.617807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.960 #36 NEW cov: 11853 ft: 12928 corp: 7/84b lim: 30 exec/s: 0 rss: 67Mb L: 14/16 MS: 1 PersAutoDict- DE: "\000\222\373\330\030\000\250\016"- 00:07:40.960 [2024-11-28 07:32:51.677345] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:40.960 [2024-11-28 07:32:51.677526] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:40.960 [2024-11-28 07:32:51.677925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.677956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.960 [2024-11-28 07:32:51.678090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.960 [2024-11-28 07:32:51.678109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.960 #37 NEW cov: 11866 ft: 13055 corp: 8/97b lim: 30 exec/s: 0 rss: 67Mb L: 13/16 MS: 1 PersAutoDict- DE: "\000\222\373\330\030\000\250\016"- 00:07:41.219 [2024-11-28 07:32:51.737754] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.219 [2024-11-28 07:32:51.737931] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (172092) > buf size (4096) 00:07:41.219 [2024-11-28 07:32:51.738094] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.219 [2024-11-28 07:32:51.738437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.219 [2024-11-28 07:32:51.738468] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.219 [2024-11-28 07:32:51.738604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.219 [2024-11-28 07:32:51.738621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.220 [2024-11-28 07:32:51.738745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.738761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.220 #38 NEW cov: 11866 ft: 13372 corp: 9/116b lim: 30 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:41.220 [2024-11-28 07:32:51.797900] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.220 [2024-11-28 07:32:51.798066] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.220 [2024-11-28 07:32:51.798227] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.220 [2024-11-28 07:32:51.798605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.798633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.220 [2024-11-28 07:32:51.798762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.798780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.220 [2024-11-28 07:32:51.798897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.798913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.220 #39 NEW cov: 11866 ft: 13454 corp: 10/137b lim: 30 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 PersAutoDict- DE: "\000\222\373\330\030\000\250\016"- 00:07:41.220 [2024-11-28 07:32:51.847933] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:41.220 [2024-11-28 07:32:51.848106] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45040) > buf size (4096) 00:07:41.220 [2024-11-28 07:32:51.848442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.848472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.220 [2024-11-28 07:32:51.848601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2bfb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.848618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.220 #40 NEW cov: 11866 ft: 13475 corp: 11/151b lim: 30 exec/s: 0 rss: 68Mb L: 14/21 MS: 1 ChangeByte- 00:07:41.220 [2024-11-28 07:32:51.898193] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.220 [2024-11-28 07:32:51.898355] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.220 [2024-11-28 07:32:51.898731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.898761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.220 [2024-11-28 07:32:51.898884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.898902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.220 #41 NEW cov: 11866 ft: 13570 corp: 12/167b lim: 30 exec/s: 0 rss: 68Mb L: 16/21 MS: 1 CopyPart- 00:07:41.220 [2024-11-28 07:32:51.958207] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.220 [2024-11-28 07:32:51.958578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.220 [2024-11-28 07:32:51.958611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.220 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.220 #45 NEW cov: 11889 ft: 14036 corp: 13/177b lim: 30 exec/s: 0 rss: 68Mb L: 10/21 MS: 4 ChangeBinInt-CopyPart-CopyPart-PersAutoDict- DE: "\000\222\373\330\030\000\250\016"- 00:07:41.479 [2024-11-28 07:32:52.008479] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:41.479 [2024-11-28 07:32:52.008665] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45040) > buf size (4096) 00:07:41.479 [2024-11-28 07:32:52.009013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.009041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.009166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2bfb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.009183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.479 #46 NEW cov: 11889 ft: 14110 corp: 14/191b lim: 30 exec/s: 0 rss: 68Mb L: 14/21 MS: 1 ChangeByte- 00:07:41.479 [2024-11-28 07:32:52.058730] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0d 00:07:41.479 [2024-11-28 07:32:52.058903] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.479 [2024-11-28 07:32:52.059260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.059289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.059420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.059438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.479 #47 NEW cov: 11889 ft: 14176 corp: 15/204b lim: 30 exec/s: 47 rss: 68Mb L: 13/21 MS: 1 ChangeBinInt- 00:07:41.479 [2024-11-28 07:32:52.108838] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e7ff 00:07:41.479 [2024-11-28 07:32:52.109001] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.479 [2024-11-28 07:32:52.109361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00928304 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.109387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.109529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.109547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.479 #48 NEW cov: 11889 ft: 14187 corp: 16/217b lim: 30 exec/s: 48 rss: 68Mb L: 13/21 MS: 1 ChangeBinInt- 00:07:41.479 [2024-11-28 07:32:52.159192] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.479 [2024-11-28 07:32:52.159358] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.479 [2024-11-28 07:32:52.159507] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.479 [2024-11-28 07:32:52.159685] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (150512) > buf size (4096) 00:07:41.479 [2024-11-28 07:32:52.160066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.160093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.160219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.160238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.160366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00d80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.160384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.160512] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:92fb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.160530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.479 #49 NEW cov: 11889 ft: 14743 corp: 17/243b lim: 30 exec/s: 49 rss: 68Mb L: 26/26 MS: 1 CopyPart- 00:07:41.479 [2024-11-28 07:32:52.219225] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0d 00:07:41.479 [2024-11-28 07:32:52.219408] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.479 [2024-11-28 07:32:52.219794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff839d cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.219828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.479 [2024-11-28 07:32:52.219963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.479 [2024-11-28 07:32:52.219980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.479 #50 NEW cov: 11889 ft: 14764 corp: 18/256b lim: 30 exec/s: 50 rss: 68Mb L: 13/26 MS: 1 ChangeByte- 00:07:41.739 [2024-11-28 07:32:52.269371] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.739 [2024-11-28 07:32:52.269536] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.739 [2024-11-28 07:32:52.269930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.269958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.270088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.270105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.739 #51 NEW cov: 11889 ft: 14864 corp: 19/269b lim: 30 exec/s: 51 rss: 68Mb L: 13/26 MS: 1 CopyPart- 00:07:41.739 [2024-11-28 07:32:52.319581] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:41.739 [2024-11-28 07:32:52.319757] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.739 [2024-11-28 07:32:52.319920] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.739 [2024-11-28 07:32:52.320270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.320298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.320421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.320437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.320561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.320580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.739 #52 NEW cov: 11889 ft: 14894 corp: 20/287b lim: 30 exec/s: 52 rss: 68Mb L: 18/26 MS: 1 CrossOver- 00:07:41.739 [2024-11-28 07:32:52.359130] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x180e 00:07:41.739 [2024-11-28 07:32:52.359292] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.739 [2024-11-28 07:32:52.359654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.359682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.359813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.359834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.739 #53 NEW cov: 11889 ft: 14955 corp: 21/302b lim: 30 exec/s: 53 rss: 68Mb L: 15/26 MS: 1 EraseBytes- 00:07:41.739 [2024-11-28 07:32:52.399200] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.739 [2024-11-28 07:32:52.399574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fbd80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.399603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.739 #54 NEW cov: 11889 ft: 14975 corp: 22/310b lim: 30 exec/s: 54 rss: 68Mb L: 8/26 MS: 1 EraseBytes- 00:07:41.739 [2024-11-28 07:32:52.450041] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6144) > len (588) 00:07:41.739 [2024-11-28 07:32:52.450197] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.739 [2024-11-28 07:32:52.450340] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa80e 00:07:41.739 [2024-11-28 07:32:52.450486] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (150512) > buf size (4096) 00:07:41.739 [2024-11-28 07:32:52.450823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.450851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.450980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a80e0018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.450999] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.451126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00d80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.451144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.739 [2024-11-28 07:32:52.451272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:92fb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.739 [2024-11-28 07:32:52.451292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.739 #55 NEW cov: 11889 ft: 15059 corp: 23/336b lim: 30 exec/s: 55 rss: 68Mb L: 26/26 MS: 1 ChangeByte- 00:07:41.999 [2024-11-28 07:32:52.510033] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x180a 00:07:41.999 [2024-11-28 07:32:52.510356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.510383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 #56 NEW cov: 11889 ft: 15068 corp: 24/342b lim: 30 exec/s: 56 rss: 68Mb L: 6/26 MS: 1 CrossOver- 00:07:41.999 [2024-11-28 07:32:52.560205] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (6360) > len (588) 00:07:41.999 [2024-11-28 07:32:52.560552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.560580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 #57 NEW cov: 11889 ft: 15071 corp: 25/352b lim: 30 exec/s: 57 rss: 68Mb L: 10/26 MS: 1 ShuffleBytes- 00:07:41.999 [2024-11-28 07:32:52.610342] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa8a1 00:07:41.999 [2024-11-28 07:32:52.610704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fbd80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.610733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 #58 NEW cov: 11889 ft: 15111 corp: 26/361b lim: 30 exec/s: 58 rss: 68Mb L: 9/26 MS: 1 InsertByte- 00:07:41.999 [2024-11-28 07:32:52.660574] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.999 [2024-11-28 07:32:52.660756] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:41.999 [2024-11-28 07:32:52.661095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.661123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 [2024-11-28 07:32:52.661240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.661257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.999 #59 NEW cov: 11889 ft: 15112 corp: 27/377b lim: 30 exec/s: 59 rss: 69Mb L: 16/26 MS: 1 ChangeBinInt- 00:07:41.999 [2024-11-28 07:32:52.710696] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa8a1 00:07:41.999 [2024-11-28 07:32:52.711056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fbd80018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.711082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 #60 NEW cov: 11889 ft: 15122 corp: 28/387b lim: 30 exec/s: 60 rss: 69Mb L: 10/26 MS: 1 InsertByte- 00:07:41.999 [2024-11-28 07:32:52.760911] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:41.999 [2024-11-28 07:32:52.761078] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:41.999 [2024-11-28 07:32:52.761443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.761474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.999 [2024-11-28 07:32:52.761602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.999 [2024-11-28 07:32:52.761620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.258 #61 NEW cov: 11889 ft: 15142 corp: 29/403b lim: 30 exec/s: 61 rss: 69Mb L: 16/26 MS: 1 ShuffleBytes- 00:07:42.258 [2024-11-28 07:32:52.811092] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.258 [2024-11-28 07:32:52.811279] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.258 [2024-11-28 07:32:52.811621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.258 [2024-11-28 07:32:52.811651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.258 [2024-11-28 07:32:52.811774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff3283ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.258 [2024-11-28 07:32:52.811793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.258 #62 NEW cov: 11889 ft: 15149 corp: 30/417b lim: 30 exec/s: 62 rss: 69Mb L: 14/26 MS: 1 InsertByte- 00:07:42.258 [2024-11-28 07:32:52.861132] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x92 00:07:42.258 [2024-11-28 07:32:52.861489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.258 [2024-11-28 07:32:52.861521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.258 #63 NEW cov: 11889 ft: 15160 corp: 31/423b lim: 30 exec/s: 63 rss: 69Mb L: 6/26 MS: 1 CopyPart- 00:07:42.258 [2024-11-28 07:32:52.911513] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.258 [2024-11-28 07:32:52.911692] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.259 [2024-11-28 07:32:52.911853] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.259 [2024-11-28 07:32:52.912019] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.259 [2024-11-28 07:32:52.912376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.912406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.259 [2024-11-28 07:32:52.912543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.912563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.259 [2024-11-28 07:32:52.912691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.912708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.259 [2024-11-28 07:32:52.912834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.912852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.259 #64 NEW cov: 11889 ft: 15174 corp: 32/450b lim: 30 exec/s: 64 rss: 69Mb L: 27/27 MS: 1 CopyPart- 00:07:42.259 [2024-11-28 07:32:52.961515] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:42.259 [2024-11-28 07:32:52.961693] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (51184) > buf size (4096) 00:07:42.259 [2024-11-28 07:32:52.962028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.962057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.259 [2024-11-28 07:32:52.962193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:31fb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:52.962213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.259 #65 NEW cov: 11889 ft: 15217 corp: 33/464b lim: 30 exec/s: 65 rss: 69Mb L: 14/27 MS: 1 ChangeBinInt- 00:07:42.259 [2024-11-28 07:32:53.021691] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x180e 00:07:42.259 [2024-11-28 07:32:53.021869] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786488) > buf size (4096) 00:07:42.259 [2024-11-28 07:32:53.022239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:009200fb cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:53.022268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.259 [2024-11-28 07:32:53.022410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000d8392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.259 [2024-11-28 07:32:53.022428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.519 #66 NEW cov: 11889 ft: 15222 corp: 34/480b lim: 30 exec/s: 66 rss: 69Mb L: 16/27 MS: 1 InsertByte- 00:07:42.519 [2024-11-28 07:32:53.081937] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:42.519 [2024-11-28 07:32:53.082113] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45040) > buf size (4096) 00:07:42.519 [2024-11-28 07:32:53.082486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.519 [2024-11-28 07:32:53.082513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 [2024-11-28 07:32:53.082629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2bfb00d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.519 [2024-11-28 07:32:53.082648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.519 #67 NEW cov: 11889 ft: 15257 corp: 35/494b lim: 30 exec/s: 33 rss: 69Mb L: 14/27 MS: 1 ChangeByte- 00:07:42.519 #67 DONE cov: 11889 ft: 15257 corp: 35/494b lim: 30 exec/s: 33 rss: 69Mb 00:07:42.519 ###### Recommended dictionary. ###### 00:07:42.519 "\000\222\373\330\030\000\250\016" # Uses: 4 00:07:42.519 ###### End of recommended dictionary. 
###### 00:07:42.519 Done 67 runs in 2 second(s) 00:07:42.519 07:32:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:42.519 07:32:53 -- ../common.sh@72 -- # (( i++ )) 00:07:42.519 07:32:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.519 07:32:53 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:42.519 07:32:53 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:42.519 07:32:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.519 07:32:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.519 07:32:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:42.519 07:32:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:42.519 07:32:53 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:42.519 07:32:53 -- nvmf/run.sh@29 -- # port=4402 00:07:42.519 07:32:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:42.519 07:32:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:42.519 07:32:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.519 07:32:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:42.519 [2024-11-28 07:32:53.256925] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:42.519 [2024-11-28 07:32:53.257001] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660353 ] 00:07:42.778 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.778 [2024-11-28 07:32:53.432000] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.778 [2024-11-28 07:32:53.451219] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.778 [2024-11-28 07:32:53.451335] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.778 [2024-11-28 07:32:53.502528] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.778 [2024-11-28 07:32:53.518838] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:42.778 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.778 INFO: Seed: 404603490 00:07:43.037 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:43.037 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:43.037 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:43.037 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.037 #2 INITED exec/s: 0 rss: 59Mb 00:07:43.037 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:43.037 This may also happen if the target rejected all inputs we tried so far 00:07:43.037 [2024-11-28 07:32:53.595418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.037 [2024-11-28 07:32:53.595452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.037 [2024-11-28 07:32:53.595602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75750075 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.037 [2024-11-28 07:32:53.595621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.296 NEW_FUNC[1/670]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:43.296 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.296 #17 NEW cov: 11584 ft: 11585 corp: 2/16b lim: 35 exec/s: 0 rss: 67Mb L: 15/15 MS: 5 CrossOver-ChangeByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:43.296 [2024-11-28 07:32:53.905483] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.296 [2024-11-28 07:32:53.905917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.905983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.296 #19 NEW cov: 11706 ft: 12370 corp: 3/26b lim: 35 exec/s: 0 rss: 67Mb L: 10/15 MS: 2 InsertByte-CMP- DE: "\000\222\373\331\321\037\224("- 00:07:43.296 [2024-11-28 07:32:53.945969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.945997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.296 [2024-11-28 07:32:53.946114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75750075 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.946133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.296 #20 NEW cov: 11712 ft: 12674 corp: 4/41b lim: 35 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 ShuffleBytes- 00:07:43.296 [2024-11-28 07:32:53.985778] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.296 [2024-11-28 07:32:53.985953] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.296 [2024-11-28 07:32:53.986113] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.296 [2024-11-28 07:32:53.986612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.986644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.296 [2024-11-28 07:32:53.986760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.986781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.296 [2024-11-28 07:32:53.986905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.986930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.296 [2024-11-28 07:32:53.987052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:28001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:53.987069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.296 #21 NEW cov: 11797 ft: 13443 corp: 5/70b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:43.296 [2024-11-28 07:32:54.036176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:74007474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:54.036203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.296 [2024-11-28 07:32:54.036321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:74007474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.296 [2024-11-28 07:32:54.036338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.296 #29 NEW cov: 11797 ft: 13536 corp: 6/89b lim: 35 exec/s: 0 rss: 67Mb L: 19/29 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:43.555 [2024-11-28 07:32:54.075895] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.555 [2024-11-28 07:32:54.076401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.076431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.555 [2024-11-28 07:32:54.076546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00920028 cdw11:d100fbd9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.076563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.555 #30 NEW cov: 11797 ft: 13594 corp: 7/107b lim: 35 exec/s: 0 rss: 67Mb L: 18/29 MS: 1 PersAutoDict- DE: "\000\222\373\331\321\037\224("- 00:07:43.555 [2024-11-28 07:32:54.116086] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.555 [2024-11-28 07:32:54.116258] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.555 [2024-11-28 07:32:54.116414] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace 
for invalid NSID 0 00:07:43.555 [2024-11-28 07:32:54.116899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.116935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.555 [2024-11-28 07:32:54.117052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.117072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.555 [2024-11-28 07:32:54.117186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.117207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.555 [2024-11-28 07:32:54.117333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:3f001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.117352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.555 #31 NEW cov: 11797 ft: 13663 corp: 8/136b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:07:43.555 [2024-11-28 07:32:54.167011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.555 [2024-11-28 07:32:54.167037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.167179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.167196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.167318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.167338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.167478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.167493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.556 #35 NEW cov: 11797 ft: 13718 corp: 9/168b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 4 ChangeByte-InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:43.556 [2024-11-28 07:32:54.206421] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.556 [2024-11-28 07:32:54.206595] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.556 [2024-11-28 
07:32:54.206749] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.556 [2024-11-28 07:32:54.207215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.207248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.207348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.207369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.207494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.207522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.207637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:3f003194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.207655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.556 #36 NEW cov: 11797 ft: 13804 corp: 10/197b lim: 35 exec/s: 0 rss: 67Mb L: 29/32 MS: 1 ChangeByte- 00:07:43.556 [2024-11-28 07:32:54.256738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.256765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.256882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:758c0075 cdw11:75008a75 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.256897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.556 #37 NEW cov: 11797 ft: 13904 corp: 11/212b lim: 35 exec/s: 0 rss: 67Mb L: 15/32 MS: 1 ChangeBinInt- 00:07:43.556 [2024-11-28 07:32:54.297000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:74007474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.297026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.556 [2024-11-28 07:32:54.297146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:74007474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.556 [2024-11-28 07:32:54.297164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.556 #38 NEW cov: 11797 ft: 13949 corp: 12/228b lim: 35 exec/s: 0 rss: 67Mb L: 16/32 MS: 1 EraseBytes- 00:07:43.816 [2024-11-28 07:32:54.337067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.337093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.337224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:758c0075 cdw11:0f008a75 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.337242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.816 #39 NEW cov: 11797 ft: 13993 corp: 13/243b lim: 35 exec/s: 0 rss: 68Mb L: 15/32 MS: 1 ChangeByte- 00:07:43.816 [2024-11-28 07:32:54.376741] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.377223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.377254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.377377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff6d00d7 cdw11:2e000426 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.377393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.816 #40 NEW cov: 11797 ft: 14078 corp: 14/261b lim: 35 exec/s: 0 rss: 68Mb L: 18/32 MS: 1 ChangeBinInt- 00:07:43.816 [2024-11-28 07:32:54.426969] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.427450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.427479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.427601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00920028 cdw11:d100fbd9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.427618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.816 #41 NEW cov: 11797 ft: 14136 corp: 15/279b lim: 35 exec/s: 0 rss: 68Mb L: 18/32 MS: 1 ChangeByte- 00:07:43.816 [2024-11-28 07:32:54.466902] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.467240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:920a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.467272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.816 #42 NEW cov: 11820 ft: 14168 corp: 16/289b lim: 35 exec/s: 0 rss: 68Mb L: 10/32 MS: 1 ChangeBinInt- 00:07:43.816 [2024-11-28 07:32:54.507299] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify 
Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.507465] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.507621] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.816 [2024-11-28 07:32:54.508089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.508124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.508242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.508268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.508386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.508408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.508529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:28001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.508546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.816 #43 NEW cov: 11820 ft: 14177 corp: 17/318b lim: 35 exec/s: 0 rss: 68Mb L: 29/32 MS: 1 ChangeBinInt- 00:07:43.816 [2024-11-28 07:32:54.547658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.547685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.816 [2024-11-28 07:32:54.547836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75750075 cdw11:75007675 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.816 [2024-11-28 07:32:54.547851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.816 #44 NEW cov: 11820 ft: 14201 corp: 18/333b lim: 35 exec/s: 44 rss: 68Mb L: 15/32 MS: 1 ChangeBinInt- 00:07:44.075 [2024-11-28 07:32:54.587966] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.075 [2024-11-28 07:32:54.588310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.075 [2024-11-28 07:32:54.588337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.075 [2024-11-28 07:32:54.588468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.075 [2024-11-28 07:32:54.588484] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.075 [2024-11-28 07:32:54.588614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.075 [2024-11-28 07:32:54.588632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.588753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.588774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.076 #45 NEW cov: 11820 ft: 14229 corp: 19/365b lim: 35 exec/s: 45 rss: 68Mb L: 32/32 MS: 1 PersAutoDict- DE: "\000\222\373\331\321\037\224("- 00:07:44.076 [2024-11-28 07:32:54.627697] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.628455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.628489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.628614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a290028 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.628632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.628753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:12120012 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.628769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.628885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:12120012 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.628900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.076 #46 NEW cov: 11820 ft: 14260 corp: 20/397b lim: 35 exec/s: 46 rss: 68Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:44.076 [2024-11-28 07:32:54.667727] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.667896] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.668046] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.668514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.668547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.668665] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.668690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.668810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.668831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.668953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:28001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.668972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.076 #47 NEW cov: 11820 ft: 14334 corp: 21/426b lim: 35 exec/s: 47 rss: 68Mb L: 29/32 MS: 1 ChangeBinInt- 00:07:44.076 [2024-11-28 07:32:54.707635] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.707998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.708032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 #48 NEW cov: 11820 ft: 14342 corp: 22/436b lim: 35 exec/s: 48 rss: 68Mb L: 10/32 MS: 1 ShuffleBytes- 00:07:44.076 [2024-11-28 07:32:54.748049] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.748559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:00007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.748586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.748709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.748732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.748855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:75750075 cdw11:75008c8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.748872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.076 #49 NEW cov: 11820 ft: 14513 corp: 23/459b lim: 35 exec/s: 49 rss: 68Mb L: 23/32 MS: 1 InsertRepeatedBytes- 00:07:44.076 [2024-11-28 07:32:54.787978] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.788442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 
07:32:54.788471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.788596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00920028 cdw11:d100dbd9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.788615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.076 #50 NEW cov: 11820 ft: 14548 corp: 24/477b lim: 35 exec/s: 50 rss: 68Mb L: 18/32 MS: 1 ChangeBit- 00:07:44.076 [2024-11-28 07:32:54.828037] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.076 [2024-11-28 07:32:54.828515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f000092 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.828546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.076 [2024-11-28 07:32:54.828665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d1280094 cdw11:d100fbd9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.076 [2024-11-28 07:32:54.828682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.335 #51 NEW cov: 11820 ft: 14553 corp: 25/495b lim: 35 exec/s: 51 rss: 68Mb L: 18/32 MS: 1 ShuffleBytes- 00:07:44.335 [2024-11-28 07:32:54.868714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.868741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.868861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:758c0075 cdw11:8d008a75 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.868878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.908753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.908779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.908903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:758c0075 cdw11:8d008a75 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.908918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.335 #53 NEW cov: 11820 ft: 14625 corp: 26/510b lim: 35 exec/s: 53 rss: 68Mb L: 15/32 MS: 2 ChangeBinInt-CopyPart- 00:07:44.335 [2024-11-28 07:32:54.948655] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.335 [2024-11-28 07:32:54.948816] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.335 [2024-11-28 07:32:54.948958] 
ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.335 [2024-11-28 07:32:54.949423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.949456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.949584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.949603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.949741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.949764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.335 [2024-11-28 07:32:54.949894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:28001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.335 [2024-11-28 07:32:54.949910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.336 #54 NEW cov: 11820 ft: 14670 corp: 27/539b lim: 35 exec/s: 54 rss: 68Mb L: 29/32 MS: 1 ShuffleBytes- 00:07:44.336 [2024-11-28 07:32:54.988792] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:54.988960] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:54.989106] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:54.989263] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:54.989769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:54.989802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:54.989930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:54.989955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:54.990081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:f6f60000 cdw11:f600f6f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:54.990107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:54.990230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:92000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:54.990250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:54.990385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:d13100d9 cdw11:0a00943f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:54.990403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.336 #55 NEW cov: 11820 ft: 14773 corp: 28/574b lim: 35 exec/s: 55 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:44.336 [2024-11-28 07:32:55.038860] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:55.039020] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:55.039179] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:55.039520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:55.039554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:55.039686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:55.039705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:55.039817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:55.039838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.336 #56 NEW cov: 11820 ft: 14800 corp: 29/598b lim: 35 exec/s: 56 rss: 68Mb L: 24/35 MS: 1 EraseBytes- 00:07:44.336 [2024-11-28 07:32:55.078872] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.336 [2024-11-28 07:32:55.079391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:55.079421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.336 [2024-11-28 07:32:55.079538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a290028 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.336 [2024-11-28 07:32:55.079554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.595 #57 NEW cov: 11820 ft: 14831 corp: 30/615b lim: 35 exec/s: 57 rss: 68Mb L: 17/35 MS: 1 EraseBytes- 00:07:44.595 [2024-11-28 07:32:55.119421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:44.595 [2024-11-28 07:32:55.119447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.119578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:759c0075 cdw11:8d008a75 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.119596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.595 #58 NEW cov: 11820 ft: 14862 corp: 31/630b lim: 35 exec/s: 58 rss: 68Mb L: 15/35 MS: 1 ChangeBit- 00:07:44.595 [2024-11-28 07:32:55.159466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:74740074 cdw11:7400747c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.159492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.159612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:74740074 cdw11:74007474 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.159628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.595 #59 NEW cov: 11820 ft: 14874 corp: 32/649b lim: 35 exec/s: 59 rss: 68Mb L: 19/35 MS: 1 ChangeBit- 00:07:44.595 [2024-11-28 07:32:55.200000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.200026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.200165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:54490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.200181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.200312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.200328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.200458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.200475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.595 #60 NEW cov: 11820 ft: 14882 corp: 33/681b lim: 35 exec/s: 60 rss: 68Mb L: 32/35 MS: 1 ChangeByte- 00:07:44.595 [2024-11-28 07:32:55.239471] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.595 [2024-11-28 07:32:55.240207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:92fb0000 cdw11:1f00d9d1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.240240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.240371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a290028 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.240388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.240511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:12120012 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.240529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.595 [2024-11-28 07:32:55.240657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:12120012 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.595 [2024-11-28 07:32:55.240679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.595 #61 NEW cov: 11820 ft: 14887 corp: 34/714b lim: 35 exec/s: 61 rss: 68Mb L: 33/35 MS: 1 InsertByte- 00:07:44.595 [2024-11-28 07:32:55.279662] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.595 [2024-11-28 07:32:55.279826] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.595 [2024-11-28 07:32:55.279990] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.595 [2024-11-28 07:32:55.280440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.280471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.280602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.280626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.280750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.280772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.280903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:3f003194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.280919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.596 #62 NEW cov: 11820 ft: 14905 corp: 35/743b lim: 35 exec/s: 62 rss: 68Mb L: 29/35 MS: 1 ShuffleBytes- 00:07:44.596 [2024-11-28 07:32:55.320007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:75750035 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.320033] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.320152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75750075 cdw11:75007575 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.320169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.596 #63 NEW cov: 11820 ft: 14910 corp: 36/759b lim: 35 exec/s: 63 rss: 69Mb L: 16/35 MS: 1 CrossOver- 00:07:44.596 [2024-11-28 07:32:55.360594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.360624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.360745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.360763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.360886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.360903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.596 [2024-11-28 07:32:55.361022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.596 [2024-11-28 07:32:55.361039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.854 #64 NEW cov: 11820 ft: 14917 corp: 37/791b lim: 35 exec/s: 64 rss: 69Mb L: 32/35 MS: 1 ShuffleBytes- 00:07:44.854 [2024-11-28 07:32:55.399964] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.854 [2024-11-28 07:32:55.400134] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.854 [2024-11-28 07:32:55.400302] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.854 [2024-11-28 07:32:55.400782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.854 [2024-11-28 07:32:55.400813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.854 [2024-11-28 07:32:55.400938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.854 [2024-11-28 07:32:55.400959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.854 [2024-11-28 07:32:55.401085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.854 [2024-11-28 07:32:55.401107] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.854 [2024-11-28 07:32:55.401245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d9d100fb cdw11:28001f94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.854 [2024-11-28 07:32:55.401262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.854 #65 NEW cov: 11820 ft: 14923 corp: 38/820b lim: 35 exec/s: 65 rss: 69Mb L: 29/35 MS: 1 CopyPart- 00:07:44.854 [2024-11-28 07:32:55.450465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f0000d1 cdw11:d90092fb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.854 [2024-11-28 07:32:55.450493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.450622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9428001f cdw11:0a002894 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.450641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 #67 NEW cov: 11820 ft: 14926 corp: 39/834b lim: 35 exec/s: 67 rss: 69Mb L: 14/35 MS: 2 EraseBytes-PersAutoDict- DE: "\000\222\373\331\321\037\224("- 00:07:44.855 [2024-11-28 07:32:55.490967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.490996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.491115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.491148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.491267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:60490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.491285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.491404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.491422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.855 #68 NEW cov: 11820 ft: 14928 corp: 40/866b lim: 35 exec/s: 68 rss: 69Mb L: 32/35 MS: 1 ChangeByte- 00:07:44.855 [2024-11-28 07:32:55.540626] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.855 [2024-11-28 07:32:55.540802] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.855 [2024-11-28 07:32:55.541125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:49490049 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 
[2024-11-28 07:32:55.541154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.541283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:49000049 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.541306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.541430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.541450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.855 [2024-11-28 07:32:55.541575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:49004949 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.855 [2024-11-28 07:32:55.541602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.855 #69 NEW cov: 11820 ft: 14954 corp: 41/898b lim: 35 exec/s: 34 rss: 69Mb L: 32/35 MS: 1 CrossOver- 00:07:44.855 #69 DONE cov: 11820 ft: 14954 corp: 41/898b lim: 35 exec/s: 34 rss: 69Mb 00:07:44.855 ###### Recommended dictionary. ###### 00:07:44.855 "\000\222\373\331\321\037\224(" # Uses: 3 00:07:44.855 ###### End of recommended dictionary. ###### 00:07:44.855 Done 69 runs in 2 second(s) 00:07:45.114 07:32:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:45.114 07:32:55 -- ../common.sh@72 -- # (( i++ )) 00:07:45.114 07:32:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.114 07:32:55 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:45.114 07:32:55 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:45.114 07:32:55 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.114 07:32:55 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.114 07:32:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:45.114 07:32:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:45.114 07:32:55 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:45.114 07:32:55 -- nvmf/run.sh@29 -- # port=4403 00:07:45.114 07:32:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:45.114 07:32:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:45.114 07:32:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.114 07:32:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:45.114 [2024-11-28 07:32:55.724334] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
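Every completion record above prints the NVMe status as (sct/sc): status code type 00 (generic command status) with status code 0b, Invalid Namespace or Format, or 02, Invalid Field in Command, followed by the phase (p), more (m), and do-not-retry (dnr) bits. For orientation, the sketch below decodes a 16-bit completion status word into exactly those fields. It follows the NVMe base specification bit layout and is a standalone illustration, not an SPDK API.

/* Decode the NVMe completion status word into the fields printed above
 * (sct/sc plus the p, m, dnr bits). Standalone sketch following the NVMe
 * base spec layout for CQE Dword 3 bits 31:16; not part of SPDK. */
#include <stdint.h>
#include <stdio.h>

struct cpl_status {
    uint8_t p;    /* phase tag,        bit 0     */
    uint8_t sc;   /* status code,      bits 8:1  */
    uint8_t sct;  /* status code type, bits 11:9 */
    uint8_t m;    /* more,             bit 14    */
    uint8_t dnr;  /* do not retry,     bit 15    */
};

static struct cpl_status decode_status(uint16_t sw)
{
    struct cpl_status s = {
        .p   = sw & 0x1,
        .sc  = (sw >> 1) & 0xff,
        .sct = (sw >> 9) & 0x7,
        .m   = (sw >> 14) & 0x1,
        .dnr = (sw >> 15) & 0x1,
    };
    return s;
}

int main(void)
{
    /* 0x0016 encodes sct=0, sc=0x0b: the INVALID NAMESPACE OR FORMAT
     * (00/0b) completions that dominate the log above. */
    struct cpl_status s = decode_status(0x0016);
    printf("(%02x/%02x) p:%u m:%u dnr:%u\n", s.sct, s.sc, s.p, s.m, s.dnr);
    return 0;
}

Run against 0x0016 this prints "(00/0b) p:0 m:0 dnr:0", matching the notation in the completion records above.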
00:07:45.114 [2024-11-28 07:32:55.724420] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660394 ] 00:07:45.114 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.373 [2024-11-28 07:32:55.900645] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.373 [2024-11-28 07:32:55.920433] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.373 [2024-11-28 07:32:55.920550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.373 [2024-11-28 07:32:55.972395] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.373 [2024-11-28 07:32:55.988732] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:45.373 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.373 INFO: Seed: 2874601659 00:07:45.373 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:45.373 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:45.373 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:45.373 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.373 #2 INITED exec/s: 0 rss: 59Mb 00:07:45.373 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.373 This may also happen if the target rejected all inputs we tried so far 00:07:45.373 [2024-11-28 07:32:56.044082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.373 [2024-11-28 07:32:56.044113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.631 NEW_FUNC[1/676]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:45.631 NEW_FUNC[2/676]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.631 #7 NEW cov: 11738 ft: 11739 corp: 2/11b lim: 20 exec/s: 0 rss: 67Mb L: 10/10 MS: 5 ChangeByte-ChangeByte-ShuffleBytes-CopyPart-CMP- DE: "\001\000\000\000\000\000\000?"- 00:07:45.631 #12 NEW cov: 11851 ft: 12542 corp: 3/15b lim: 20 exec/s: 0 rss: 67Mb L: 4/10 MS: 5 InsertByte-ShuffleBytes-InsertByte-CopyPart-InsertByte- 00:07:45.631 [2024-11-28 07:32:56.384948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.631 [2024-11-28 07:32:56.384983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.890 #13 NEW cov: 11857 ft: 12810 corp: 4/25b lim: 20 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:45.890 #14 NEW cov: 11942 ft: 13176 corp: 5/29b lim: 20 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 CrossOver- 00:07:45.890 #15 NEW cov: 11942 ft: 13248 corp: 6/33b lim: 20 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:45.890 NEW_FUNC[1/2]: 0x1292e78 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:773 00:07:45.890 NEW_FUNC[2/2]: 0x12b3f38 in nvmf_tcp_qpair_abort_request 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3493 00:07:45.890 #16 NEW cov: 11999 ft: 13447 corp: 7/42b lim: 20 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:07:45.890 [2024-11-28 07:32:56.545570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.890 [2024-11-28 07:32:56.545603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.890 #22 NEW cov: 12003 ft: 13846 corp: 8/55b lim: 20 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:07:45.890 [2024-11-28 07:32:56.595563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.890 [2024-11-28 07:32:56.595590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.890 #23 NEW cov: 12003 ft: 13867 corp: 9/64b lim: 20 exec/s: 0 rss: 67Mb L: 9/13 MS: 1 InsertRepeatedBytes- 00:07:45.890 #26 NEW cov: 12020 ft: 14034 corp: 10/82b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:46.149 #27 NEW cov: 12020 ft: 14060 corp: 11/86b lim: 20 exec/s: 0 rss: 67Mb L: 4/18 MS: 1 ChangeBit- 00:07:46.149 #28 NEW cov: 12020 ft: 14088 corp: 12/95b lim: 20 exec/s: 0 rss: 68Mb L: 9/18 MS: 1 ChangeByte- 00:07:46.149 #29 NEW cov: 12020 ft: 14134 corp: 13/107b lim: 20 exec/s: 0 rss: 68Mb L: 12/18 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:07:46.149 #30 NEW cov: 12020 ft: 14149 corp: 14/112b lim: 20 exec/s: 0 rss: 68Mb L: 5/18 MS: 1 InsertByte- 00:07:46.149 #31 NEW cov: 12020 ft: 14165 corp: 15/131b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertByte- 00:07:46.149 #32 NEW cov: 12020 ft: 14181 corp: 16/140b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 ChangeBinInt- 00:07:46.407 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.408 #33 NEW cov: 12043 ft: 14208 corp: 17/149b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 ChangeBinInt- 00:07:46.408 #34 NEW cov: 12043 ft: 14224 corp: 18/153b lim: 20 exec/s: 0 rss: 68Mb L: 4/19 MS: 1 CopyPart- 00:07:46.408 #35 NEW cov: 12043 ft: 14354 corp: 19/161b lim: 20 exec/s: 0 rss: 68Mb L: 8/19 MS: 1 CrossOver- 00:07:46.408 #36 NEW cov: 12043 ft: 14433 corp: 20/173b lim: 20 exec/s: 36 rss: 68Mb L: 12/19 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:07:46.408 #37 NEW cov: 12043 ft: 14483 corp: 21/183b lim: 20 exec/s: 37 rss: 68Mb L: 10/19 MS: 1 InsertByte- 00:07:46.408 #38 NEW cov: 12043 ft: 14516 corp: 22/191b lim: 20 exec/s: 38 rss: 68Mb L: 8/19 MS: 1 EraseBytes- 00:07:46.408 [2024-11-28 07:32:57.157417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.408 [2024-11-28 07:32:57.157445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.408 NEW_FUNC[1/1]: 0x133cae8 in _nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3438 00:07:46.408 #40 NEW cov: 12070 ft: 14609 corp: 23/210b lim: 20 exec/s: 40 rss: 68Mb L: 19/19 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:46.666 #42 NEW cov: 12070 ft: 14628 corp: 24/214b lim: 20 exec/s: 42 rss: 68Mb L: 4/19 MS: 2 
EraseBytes-InsertByte- 00:07:46.666 [2024-11-28 07:32:57.237646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.666 [2024-11-28 07:32:57.237674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.666 #43 NEW cov: 12070 ft: 14683 corp: 25/231b lim: 20 exec/s: 43 rss: 68Mb L: 17/19 MS: 1 CopyPart- 00:07:46.666 [2024-11-28 07:32:57.287922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.666 [2024-11-28 07:32:57.287951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.666 #44 NEW cov: 12070 ft: 14769 corp: 26/251b lim: 20 exec/s: 44 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:07:46.666 [2024-11-28 07:32:57.337653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.666 [2024-11-28 07:32:57.337681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.666 #45 NEW cov: 12070 ft: 14788 corp: 27/261b lim: 20 exec/s: 45 rss: 68Mb L: 10/20 MS: 1 ChangeByte- 00:07:46.666 #46 NEW cov: 12070 ft: 14801 corp: 28/269b lim: 20 exec/s: 46 rss: 68Mb L: 8/20 MS: 1 CopyPart- 00:07:46.926 #47 NEW cov: 12070 ft: 14804 corp: 29/283b lim: 20 exec/s: 47 rss: 68Mb L: 14/20 MS: 1 InsertRepeatedBytes- 00:07:46.926 [2024-11-28 07:32:57.458315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.926 [2024-11-28 07:32:57.458343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.926 #48 NEW cov: 12070 ft: 14827 corp: 30/300b lim: 20 exec/s: 48 rss: 69Mb L: 17/20 MS: 1 CrossOver- 00:07:46.926 [2024-11-28 07:32:57.498318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.926 [2024-11-28 07:32:57.498345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.926 #49 NEW cov: 12070 ft: 14879 corp: 31/313b lim: 20 exec/s: 49 rss: 69Mb L: 13/20 MS: 1 ChangeByte- 00:07:46.926 #50 NEW cov: 12070 ft: 14904 corp: 32/320b lim: 20 exec/s: 50 rss: 69Mb L: 7/20 MS: 1 CrossOver- 00:07:46.926 [2024-11-28 07:32:57.588746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.926 [2024-11-28 07:32:57.588772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.926 #51 NEW cov: 12070 ft: 14982 corp: 33/340b lim: 20 exec/s: 51 rss: 69Mb L: 20/20 MS: 1 ChangeByte- 00:07:46.926 #52 NEW cov: 12070 ft: 14989 corp: 34/360b lim: 20 exec/s: 52 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:47.185 #53 NEW cov: 12070 ft: 15001 corp: 35/371b lim: 20 exec/s: 53 rss: 69Mb L: 11/20 MS: 1 InsertRepeatedBytes- 00:07:47.185 #54 NEW cov: 12070 ft: 15026 corp: 36/381b lim: 20 exec/s: 54 rss: 69Mb L: 10/20 MS: 1 ChangeBinInt- 00:07:47.185 #55 NEW cov: 12070 ft: 15049 corp: 37/386b lim: 20 exec/s: 55 rss: 69Mb L: 
5/20 MS: 1 InsertByte- 00:07:47.185 #56 NEW cov: 12070 ft: 15052 corp: 38/394b lim: 20 exec/s: 56 rss: 69Mb L: 8/20 MS: 1 CopyPart- 00:07:47.185 #57 NEW cov: 12070 ft: 15070 corp: 39/402b lim: 20 exec/s: 57 rss: 69Mb L: 8/20 MS: 1 InsertRepeatedBytes- 00:07:47.185 #58 NEW cov: 12070 ft: 15081 corp: 40/412b lim: 20 exec/s: 58 rss: 69Mb L: 10/20 MS: 1 ChangeBinInt- 00:07:47.185 [2024-11-28 07:32:57.909542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.185 [2024-11-28 07:32:57.909569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.185 #59 NEW cov: 12070 ft: 15092 corp: 41/431b lim: 20 exec/s: 59 rss: 69Mb L: 19/20 MS: 1 EraseBytes- 00:07:47.461 #60 NEW cov: 12070 ft: 15163 corp: 42/437b lim: 20 exec/s: 60 rss: 69Mb L: 6/20 MS: 1 EraseBytes- 00:07:47.461 [2024-11-28 07:32:57.999786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.461 [2024-11-28 07:32:57.999816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.461 #61 NEW cov: 12070 ft: 15176 corp: 43/450b lim: 20 exec/s: 30 rss: 69Mb L: 13/20 MS: 1 ChangeBit- 00:07:47.461 #61 DONE cov: 12070 ft: 15176 corp: 43/450b lim: 20 exec/s: 30 rss: 69Mb 00:07:47.461 ###### Recommended dictionary. ###### 00:07:47.461 "\001\000\000\000\000\000\000?" # Uses: 3 00:07:47.461 ###### End of recommended dictionary. ###### 00:07:47.461 Done 61 runs in 2 second(s) 00:07:47.461 07:32:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:47.461 07:32:58 -- ../common.sh@72 -- # (( i++ )) 00:07:47.461 07:32:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.461 07:32:58 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:47.461 07:32:58 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:47.461 07:32:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.461 07:32:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.461 07:32:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.461 07:32:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:47.461 07:32:58 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:47.461 07:32:58 -- nvmf/run.sh@29 -- # port=4404 00:07:47.461 07:32:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.461 07:32:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:47.461 07:32:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.461 07:32:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:47.461 [2024-11-28 07:32:58.180262] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
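The nvmf/run.sh trace above shows how each fuzzer instance is provisioned: fuzzer N gets TCP port 44NN (via printf %02d), a private JSON config rewritten from fuzz_json.conf with sed, a per-fuzzer corpus directory, and its own RPC socket. A minimal sketch of that setup, reconstructed from the traced commands rather than copied from run.sh (set -x does not log redirections, so the redirect of the sed output into $nvmf_cfg is inferred):

    # Sketch only; signature per the 'start_llvm_fuzz 4 1 0x1' call traced above.
    # $rootdir and $output_dir stand in for the long Jenkins workspace paths.
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local port="44$(printf %02d "$fuzzer_type")"   # 4 -> 4404, 5 -> 4405
        local nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
        local corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

        mkdir -p "$corpus_dir"
        # Rewrite the template so the in-process NVMe/TCP target listens on this run's port.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m "$core" -s 512 -P "$output_dir/llvm/" -F "$trid" \
            -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" \
            -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
    }

The -Z value (matching fuzzer_type) appears to select which command path is fuzzed: the run just finished (-Z 3) exercised the TCP abort-request path in lib/nvmf/tcp.c, while run 4 starting below registers fuzz_admin_create_io_completion_queue_command.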
00:07:47.461 [2024-11-28 07:32:58.180340] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660442 ] 00:07:47.461 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.814 [2024-11-28 07:32:58.356094] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.814 [2024-11-28 07:32:58.375341] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.814 [2024-11-28 07:32:58.375455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.814 [2024-11-28 07:32:58.426717] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.814 [2024-11-28 07:32:58.443082] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:47.814 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.814 INFO: Seed: 1033653191 00:07:47.814 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:47.814 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:47.814 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.814 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.814 #2 INITED exec/s: 0 rss: 59Mb 00:07:47.814 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.814 This may also happen if the target rejected all inputs we tried so far 00:07:47.814 [2024-11-28 07:32:58.508438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.814 [2024-11-28 07:32:58.508468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.814 [2024-11-28 07:32:58.508521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.814 [2024-11-28 07:32:58.508535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.105 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:48.105 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.105 #10 NEW cov: 11589 ft: 11601 corp: 2/20b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:48.105 [2024-11-28 07:32:58.820332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.105 [2024-11-28 07:32:58.820384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.105 [2024-11-28 07:32:58.820516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.105 [2024-11-28 07:32:58.820539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.105 #11 NEW cov: 11718 ft: 12394 corp: 3/40b lim: 35 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertByte- 00:07:48.105 [2024-11-28 07:32:58.869850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.105 [2024-11-28 07:32:58.869877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.105 [2024-11-28 07:32:58.869993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.105 [2024-11-28 07:32:58.870010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #12 NEW cov: 11724 ft: 12645 corp: 4/59b lim: 35 exec/s: 0 rss: 67Mb L: 19/20 MS: 1 ChangeBit- 00:07:48.363 [2024-11-28 07:32:58.910305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.910331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:58.910453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.910471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #14 NEW cov: 11809 ft: 12835 corp: 5/73b lim: 35 exec/s: 0 rss: 67Mb L: 14/20 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:48.363 [2024-11-28 07:32:58.950923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.950951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:58.951065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.951081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:58.951192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.951208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:58.951321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.951339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.363 #15 NEW cov: 11809 ft: 13222 corp: 6/103b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:48.363 [2024-11-28 07:32:58.990583] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e3e3e3e3 cdw11:e3e70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.990612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:58.990744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:58.990760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #16 NEW cov: 11809 ft: 13279 corp: 7/117b lim: 35 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 ChangeBit- 00:07:48.363 [2024-11-28 07:32:59.030712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.030738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:59.030854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.030870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #17 NEW cov: 11809 ft: 13413 corp: 8/137b lim: 35 exec/s: 0 rss: 67Mb L: 20/30 MS: 1 CrossOver- 00:07:48.363 [2024-11-28 07:32:59.070333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.070360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:59.070477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.070493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #18 NEW cov: 11809 ft: 13460 corp: 9/151b lim: 35 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 ShuffleBytes- 00:07:48.363 [2024-11-28 07:32:59.110511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.110537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.363 [2024-11-28 07:32:59.110658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.363 [2024-11-28 07:32:59.110675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.363 #19 NEW cov: 11809 ft: 13506 corp: 10/170b lim: 35 exec/s: 0 rss: 67Mb L: 19/30 MS: 1 CMP- DE: "\001\000\002\000"- 00:07:48.621 [2024-11-28 07:32:59.161766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.161794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.161914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86018686 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.161933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.162057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.162076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.162180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.162199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.621 #20 NEW cov: 11809 ft: 13663 corp: 11/200b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:48.621 [2024-11-28 07:32:59.221070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.221095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.621 #21 NEW cov: 11809 ft: 14401 corp: 12/213b lim: 35 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 EraseBytes- 00:07:48.621 [2024-11-28 07:32:59.260947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.260972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.261089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.261106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.621 #22 NEW cov: 11809 ft: 14413 corp: 13/233b lim: 35 exec/s: 0 rss: 67Mb L: 20/30 MS: 1 ShuffleBytes- 00:07:48.621 [2024-11-28 07:32:59.311571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:020001fd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.311602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.311735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.311752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.621 #23 NEW cov: 11809 ft: 14432 corp: 
14/252b lim: 35 exec/s: 0 rss: 67Mb L: 19/30 MS: 1 ChangeByte- 00:07:48.621 [2024-11-28 07:32:59.371743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000071e8 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.371770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.621 [2024-11-28 07:32:59.371865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.621 [2024-11-28 07:32:59.371882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.879 #24 NEW cov: 11832 ft: 14483 corp: 15/272b lim: 35 exec/s: 0 rss: 68Mb L: 20/30 MS: 1 InsertByte- 00:07:48.879 [2024-11-28 07:32:59.421914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.421941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.422065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:feff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.422082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 #25 NEW cov: 11832 ft: 14505 corp: 16/291b lim: 35 exec/s: 0 rss: 68Mb L: 19/30 MS: 1 ChangeBinInt- 00:07:48.879 [2024-11-28 07:32:59.461976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.462004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.462136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.462155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.462281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.462298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.879 #26 NEW cov: 11832 ft: 14718 corp: 17/314b lim: 35 exec/s: 26 rss: 68Mb L: 23/30 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:48.879 [2024-11-28 07:32:59.522253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.522280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.522395] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.522414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 #27 NEW cov: 11832 ft: 14720 corp: 18/333b lim: 35 exec/s: 27 rss: 68Mb L: 19/30 MS: 1 ShuffleBytes- 00:07:48.879 [2024-11-28 07:32:59.573017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.573045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.573193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.573211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.573324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.573341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.573473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.573489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.879 #28 NEW cov: 11832 ft: 14781 corp: 19/363b lim: 35 exec/s: 28 rss: 68Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:48.879 [2024-11-28 07:32:59.612613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.612653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.612792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:79797379 cdw11:79ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.612809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.879 [2024-11-28 07:32:59.612928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.879 [2024-11-28 07:32:59.612946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.880 [2024-11-28 07:32:59.613070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.880 [2024-11-28 07:32:59.613086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.880 #29 NEW cov: 11832 ft: 
14807 corp: 20/393b lim: 35 exec/s: 29 rss: 68Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:49.138 [2024-11-28 07:32:59.663215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.663243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.663364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00420000 cdw11:42420002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.663397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.663521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.663542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.663672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.663689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.138 #30 NEW cov: 11832 ft: 14861 corp: 21/421b lim: 35 exec/s: 30 rss: 68Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:49.138 [2024-11-28 07:32:59.712802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.712829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.712954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.712972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 #31 NEW cov: 11832 ft: 14929 corp: 22/440b lim: 35 exec/s: 31 rss: 68Mb L: 19/30 MS: 1 ShuffleBytes- 00:07:49.138 [2024-11-28 07:32:59.752929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100e800 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.752954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.753075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.753093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 #32 NEW cov: 11832 ft: 14936 corp: 23/459b lim: 35 exec/s: 32 rss: 68Mb L: 19/30 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:07:49.138 [2024-11-28 07:32:59.803013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:e3e3e3e3 cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.803040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.803167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e300e3e3 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.803183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 #33 NEW cov: 11832 ft: 15005 corp: 24/475b lim: 35 exec/s: 33 rss: 68Mb L: 16/30 MS: 1 InsertRepeatedBytes- 00:07:49.138 [2024-11-28 07:32:59.843275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.843302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.843423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86308686 cdw11:86000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.843440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.843564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.843581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.843709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.843727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.138 #34 NEW cov: 11832 ft: 15046 corp: 25/505b lim: 35 exec/s: 34 rss: 68Mb L: 30/30 MS: 1 ChangeByte- 00:07:49.138 [2024-11-28 07:32:59.883401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.883427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.883545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.883561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.883686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.883704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.138 [2024-11-28 07:32:59.883805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:7 nsid:0 cdw10:07000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.138 [2024-11-28 07:32:59.883821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.138 #35 NEW cov: 11832 ft: 15061 corp: 26/535b lim: 35 exec/s: 35 rss: 68Mb L: 30/30 MS: 1 ChangeByte- 00:07:49.396 [2024-11-28 07:32:59.933113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100e800 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:32:59.933140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.396 #36 NEW cov: 11832 ft: 15088 corp: 27/544b lim: 35 exec/s: 36 rss: 68Mb L: 9/30 MS: 1 CrossOver- 00:07:49.396 [2024-11-28 07:32:59.983820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:32:59.983847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:32:59.983971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00c70000 cdw11:839b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:32:59.983988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:32:59.984112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:9200ddfb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:32:59.984131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.396 #37 NEW cov: 11832 ft: 15161 corp: 28/567b lim: 35 exec/s: 37 rss: 68Mb L: 23/30 MS: 1 CMP- DE: "\307\203\233\003\335\373\222\000"- 00:07:49.396 [2024-11-28 07:33:00.044120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100e800 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.044149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:33:00.044279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.044299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:33:00.044425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.044443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.396 #38 NEW cov: 11832 ft: 15237 corp: 29/594b lim: 35 exec/s: 38 rss: 68Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:49.396 [2024-11-28 07:33:00.094269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 
07:33:00.094296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:33:00.094444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00c70000 cdw11:839b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.094464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.396 [2024-11-28 07:33:00.094591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:9200cdfb cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.094614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.396 #39 NEW cov: 11832 ft: 15306 corp: 30/617b lim: 35 exec/s: 39 rss: 68Mb L: 23/30 MS: 1 ChangeBit- 00:07:49.396 [2024-11-28 07:33:00.143859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.396 [2024-11-28 07:33:00.143885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 #40 NEW cov: 11832 ft: 15334 corp: 31/627b lim: 35 exec/s: 40 rss: 68Mb L: 10/30 MS: 1 EraseBytes- 00:07:49.654 [2024-11-28 07:33:00.194794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:8686e886 cdw11:86860001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.194821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.194948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:86018686 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.194969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.195089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.195108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.195237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.195255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.654 #41 NEW cov: 11832 ft: 15437 corp: 32/659b lim: 35 exec/s: 41 rss: 68Mb L: 32/32 MS: 1 CopyPart- 00:07:49.654 [2024-11-28 07:33:00.254458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100e800 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.254484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.254608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.254644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.654 #42 NEW cov: 11832 ft: 15450 corp: 33/678b lim: 35 exec/s: 42 rss: 68Mb L: 19/32 MS: 1 ChangeBit- 00:07:49.654 [2024-11-28 07:33:00.304871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.304898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.305033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.305051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.305175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.305192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.654 #43 NEW cov: 11832 ft: 15472 corp: 34/703b lim: 35 exec/s: 43 rss: 68Mb L: 25/32 MS: 1 InsertRepeatedBytes- 00:07:49.654 [2024-11-28 07:33:00.354723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.354750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.354878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.354894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.654 #44 NEW cov: 11832 ft: 15485 corp: 35/722b lim: 35 exec/s: 44 rss: 68Mb L: 19/32 MS: 1 ChangeBinInt- 00:07:49.654 [2024-11-28 07:33:00.394622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100e800 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.394647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.394759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.394777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.654 [2024-11-28 07:33:00.394887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.654 [2024-11-28 07:33:00.394903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:49.654 #45 NEW cov: 11832 ft: 15492 corp: 36/749b lim: 35 exec/s: 45 rss: 68Mb L: 27/32 MS: 1 CopyPart- 00:07:49.913 [2024-11-28 07:33:00.445405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:e8000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.445434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.913 [2024-11-28 07:33:00.445552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.445569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.913 [2024-11-28 07:33:00.445692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.445710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.913 #46 NEW cov: 11832 ft: 15505 corp: 37/775b lim: 35 exec/s: 46 rss: 69Mb L: 26/32 MS: 1 InsertByte- 00:07:49.913 [2024-11-28 07:33:00.505727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.505754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.913 [2024-11-28 07:33:00.505885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.505901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.913 [2024-11-28 07:33:00.506012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e3e3e3e3 cdw11:e3e30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.506031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.913 [2024-11-28 07:33:00.506159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000e3e3 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.913 [2024-11-28 07:33:00.506175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.913 #47 NEW cov: 11832 ft: 15507 corp: 38/805b lim: 35 exec/s: 23 rss: 69Mb L: 30/32 MS: 1 CrossOver- 00:07:49.913 #47 DONE cov: 11832 ft: 15507 corp: 38/805b lim: 35 exec/s: 23 rss: 69Mb 00:07:49.913 ###### Recommended dictionary. ###### 00:07:49.913 "\001\000\002\000" # Uses: 3 00:07:49.913 "\307\203\233\003\335\373\222\000" # Uses: 0 00:07:49.913 ###### End of recommended dictionary. 
###### 00:07:49.913 Done 47 runs in 2 second(s) 00:07:49.913 07:33:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:49.913 07:33:00 -- ../common.sh@72 -- # (( i++ )) 00:07:49.913 07:33:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.913 07:33:00 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:49.913 07:33:00 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:49.913 07:33:00 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.913 07:33:00 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.913 07:33:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:49.913 07:33:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:49.913 07:33:00 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:49.913 07:33:00 -- nvmf/run.sh@29 -- # port=4405 00:07:49.913 07:33:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:49.913 07:33:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:49.913 07:33:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.913 07:33:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:49.913 [2024-11-28 07:33:00.681618] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:49.913 [2024-11-28 07:33:00.681712] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660541 ] 00:07:50.172 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.172 [2024-11-28 07:33:00.860966] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.172 [2024-11-28 07:33:00.881695] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.172 [2024-11-28 07:33:00.881817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.172 [2024-11-28 07:33:00.933476] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.430 [2024-11-28 07:33:00.949865] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:50.430 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.430 INFO: Seed: 3540639512 00:07:50.430 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:50.430 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:50.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:50.430 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.430 #2 INITED exec/s: 0 rss: 60Mb 00:07:50.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
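The INFO block above is libFuzzer's standard startup banner (seed, instrumented 8-bit counters, empty starting corpus), and the #N lines throughout these runs are its status events: #N is the execution count at which the event fired, cov the number of covered coverage points, ft the distinct features, corp the corpus size in units/bytes, lim the current input-length cap, L the new unit's length against the largest in the corpus, and MS the mutation sequence that produced it (DE: names the dictionary entry a PersAutoDict mutation used). The "Recommended dictionary" block printed at DONE, like run 4's above, lists persistent-auto-dictionary tokens with their use counts so they could be fed back to a later run. A hypothetical way to do that with run 4's two tokens, re-encoded from octal into the standard name="value" hex-escape dictionary syntax (illustrative only; this job's run.sh passes no -dict flag, and the entry names are made up):

    # run 4 recommended "\001\000\002\000" (Uses: 3) and
    # "\307\203\233\003\335\373\222\000" (Uses: 0); write them as a dict file.
    printf '%s\n' \
        'token1="\x01\x00\x02\x00"' \
        'token2="\xc7\x83\x9b\x03\xdd\xfb\x92\x00"' > nvmf4.dict
    # A libFuzzer binary would then take -dict=nvmf4.dict on its command line;
    # whether llvm_nvme_fuzz forwards that option is not shown in this log.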
00:07:50.430 This may also happen if the target rejected all inputs we tried so far 00:07:50.430 [2024-11-28 07:33:00.994425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.430 [2024-11-28 07:33:00.994460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.687 NEW_FUNC[1/671]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:50.687 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.687 #3 NEW cov: 11615 ft: 11616 corp: 2/15b lim: 45 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:07:50.687 [2024-11-28 07:33:01.315153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.687 [2024-11-28 07:33:01.315190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.687 #24 NEW cov: 11729 ft: 12112 corp: 3/28b lim: 45 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 EraseBytes- 00:07:50.687 [2024-11-28 07:33:01.385245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.687 [2024-11-28 07:33:01.385277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.687 #25 NEW cov: 11735 ft: 12410 corp: 4/41b lim: 45 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 ShuffleBytes- 00:07:50.687 [2024-11-28 07:33:01.445409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.687 [2024-11-28 07:33:01.445440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.945 #26 NEW cov: 11820 ft: 12695 corp: 5/54b lim: 45 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 ChangeBit- 00:07:50.945 [2024-11-28 07:33:01.495544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.945 [2024-11-28 07:33:01.495576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.945 #27 NEW cov: 11820 ft: 12813 corp: 6/67b lim: 45 exec/s: 0 rss: 67Mb L: 13/14 MS: 1 ChangeByte- 00:07:50.945 [2024-11-28 07:33:01.555748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63633063 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.945 [2024-11-28 07:33:01.555780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.945 #28 NEW cov: 11820 ft: 12903 corp: 7/81b lim: 45 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ChangeByte- 00:07:50.945 [2024-11-28 07:33:01.605832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a41 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.945 [2024-11-28 
07:33:01.605864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.945 #34 NEW cov: 11820 ft: 12968 corp: 8/94b lim: 45 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 ChangeByte- 00:07:50.945 [2024-11-28 07:33:01.656072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63633063 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.945 [2024-11-28 07:33:01.656104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.945 [2024-11-28 07:33:01.656135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.945 [2024-11-28 07:33:01.656150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.945 #35 NEW cov: 11820 ft: 13764 corp: 9/116b lim: 45 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 CopyPart- 00:07:51.203 [2024-11-28 07:33:01.726158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.726190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.203 #36 NEW cov: 11820 ft: 13858 corp: 10/130b lim: 45 exec/s: 0 rss: 68Mb L: 14/22 MS: 1 InsertByte- 00:07:51.203 [2024-11-28 07:33:01.776277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.776309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.203 #37 NEW cov: 11820 ft: 13925 corp: 11/142b lim: 45 exec/s: 0 rss: 68Mb L: 12/22 MS: 1 EraseBytes- 00:07:51.203 [2024-11-28 07:33:01.826440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.826473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.203 #38 NEW cov: 11820 ft: 13941 corp: 12/153b lim: 45 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 EraseBytes- 00:07:51.203 [2024-11-28 07:33:01.886610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.886642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.203 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.203 #39 NEW cov: 11837 ft: 13991 corp: 13/166b lim: 45 exec/s: 0 rss: 68Mb L: 13/22 MS: 1 ChangeBit- 00:07:51.203 [2024-11-28 07:33:01.967625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63633063 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.967653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.203 [2024-11-28 
07:33:01.967709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.203 [2024-11-28 07:33:01.967724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.461 #45 NEW cov: 11837 ft: 14123 corp: 14/188b lim: 45 exec/s: 45 rss: 68Mb L: 22/22 MS: 1 ChangeBinInt- 00:07:51.461 [2024-11-28 07:33:02.017587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.017622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.461 #46 NEW cov: 11837 ft: 14157 corp: 15/198b lim: 45 exec/s: 46 rss: 68Mb L: 10/22 MS: 1 EraseBytes- 00:07:51.461 [2024-11-28 07:33:02.057863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63633063 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.057890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.461 [2024-11-28 07:33:02.057942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.057956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.461 #47 NEW cov: 11837 ft: 14200 corp: 16/220b lim: 45 exec/s: 47 rss: 68Mb L: 22/22 MS: 1 CMP- DE: "\377\377\377\365"- 00:07:51.461 [2024-11-28 07:33:02.097867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.097893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.461 #48 NEW cov: 11837 ft: 14329 corp: 17/234b lim: 45 exec/s: 48 rss: 68Mb L: 14/22 MS: 1 ShuffleBytes- 00:07:51.461 [2024-11-28 07:33:02.138084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:52630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.138110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.461 [2024-11-28 07:33:02.138164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636163 cdw11:61630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.138178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.461 #49 NEW cov: 11837 ft: 14409 corp: 18/253b lim: 45 exec/s: 49 rss: 68Mb L: 19/22 MS: 1 CopyPart- 00:07:51.461 [2024-11-28 07:33:02.178009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.178035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.461 #50 NEW cov: 11837 ft: 
14509 corp: 19/266b lim: 45 exec/s: 50 rss: 68Mb L: 13/22 MS: 1 ChangeBinInt- 00:07:51.461 [2024-11-28 07:33:02.218127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.461 [2024-11-28 07:33:02.218153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.719 #51 NEW cov: 11837 ft: 14538 corp: 20/276b lim: 45 exec/s: 51 rss: 68Mb L: 10/22 MS: 1 EraseBytes- 00:07:51.720 [2024-11-28 07:33:02.258447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.258474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 [2024-11-28 07:33:02.258527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:32630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.258540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.720 #52 NEW cov: 11837 ft: 14567 corp: 21/294b lim: 45 exec/s: 52 rss: 68Mb L: 18/22 MS: 1 CopyPart- 00:07:51.720 [2024-11-28 07:33:02.298421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.298450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 #53 NEW cov: 11837 ft: 14579 corp: 22/309b lim: 45 exec/s: 53 rss: 68Mb L: 15/22 MS: 1 InsertByte- 00:07:51.720 [2024-11-28 07:33:02.328491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.328516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 #54 NEW cov: 11837 ft: 14585 corp: 23/324b lim: 45 exec/s: 54 rss: 68Mb L: 15/22 MS: 1 CopyPart- 00:07:51.720 [2024-11-28 07:33:02.368608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.368633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 #55 NEW cov: 11837 ft: 14586 corp: 24/338b lim: 45 exec/s: 55 rss: 68Mb L: 14/22 MS: 1 ChangeBit- 00:07:51.720 [2024-11-28 07:33:02.409056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.409081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 [2024-11-28 07:33:02.409135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.409148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.720 [2024-11-28 07:33:02.409203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.409216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.720 #63 NEW cov: 11837 ft: 14840 corp: 25/365b lim: 45 exec/s: 63 rss: 68Mb L: 27/27 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:51.720 [2024-11-28 07:33:02.448840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.448866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 #64 NEW cov: 11837 ft: 14849 corp: 26/378b lim: 45 exec/s: 64 rss: 68Mb L: 13/27 MS: 1 ChangeByte- 00:07:51.720 [2024-11-28 07:33:02.489152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.489178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.720 [2024-11-28 07:33:02.489235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.720 [2024-11-28 07:33:02.489249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.977 #66 NEW cov: 11837 ft: 14853 corp: 27/396b lim: 45 exec/s: 66 rss: 68Mb L: 18/27 MS: 2 EraseBytes-CrossOver- 00:07:51.977 [2024-11-28 07:33:02.529070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9cee9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.977 [2024-11-28 07:33:02.529095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.977 #67 NEW cov: 11837 ft: 14888 corp: 28/406b lim: 45 exec/s: 67 rss: 68Mb L: 10/27 MS: 1 ChangeBinInt- 00:07:51.977 [2024-11-28 07:33:02.569207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.977 [2024-11-28 07:33:02.569232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.977 #68 NEW cov: 11837 ft: 14894 corp: 29/421b lim: 45 exec/s: 68 rss: 68Mb L: 15/27 MS: 1 ChangeBit- 00:07:51.977 [2024-11-28 07:33:02.609372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.977 [2024-11-28 07:33:02.609397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.977 #69 NEW cov: 11837 ft: 14939 corp: 30/436b lim: 45 exec/s: 69 rss: 69Mb L: 15/27 MS: 1 ShuffleBytes- 00:07:51.977 [2024-11-28 07:33:02.649438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:51.977 [2024-11-28 07:33:02.649463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.977 #70 NEW cov: 11837 ft: 15005 corp: 31/453b lim: 45 exec/s: 70 rss: 69Mb L: 17/27 MS: 1 PersAutoDict- DE: "\377\377\377\365"- 00:07:51.977 [2024-11-28 07:33:02.689627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e3630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.977 [2024-11-28 07:33:02.689653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.977 #71 NEW cov: 11837 ft: 15008 corp: 32/466b lim: 45 exec/s: 71 rss: 69Mb L: 13/27 MS: 1 ChangeBit- 00:07:51.977 [2024-11-28 07:33:02.729695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:f5630007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.977 [2024-11-28 07:33:02.729721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #82 NEW cov: 11837 ft: 15021 corp: 33/483b lim: 45 exec/s: 82 rss: 69Mb L: 17/27 MS: 1 PersAutoDict- DE: "\377\377\377\365"- 00:07:52.235 [2024-11-28 07:33:02.769845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:11e70a47 cdw11:6de30007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.769870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #83 NEW cov: 11837 ft: 15028 corp: 34/496b lim: 45 exec/s: 83 rss: 69Mb L: 13/27 MS: 1 CMP- DE: "G\021\347m\343\373\222\000"- 00:07:52.235 [2024-11-28 07:33:02.810297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:52630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.810323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 [2024-11-28 07:33:02.810379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:63616363 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.810393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.235 [2024-11-28 07:33:02.810445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:63616363 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.810459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.235 #84 NEW cov: 11837 ft: 15038 corp: 35/527b lim: 45 exec/s: 84 rss: 69Mb L: 31/31 MS: 1 CrossOver- 00:07:52.235 [2024-11-28 07:33:02.850090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:63630a63 cdw11:63320003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.850119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #85 NEW cov: 11837 ft: 15042 corp: 36/537b lim: 45 exec/s: 85 rss: 69Mb L: 10/31 MS: 1 EraseBytes- 00:07:52.235 [2024-11-28 
07:33:02.890188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:11ff0aff cdw11:f5630007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.890213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #86 NEW cov: 11844 ft: 15074 corp: 37/554b lim: 45 exec/s: 86 rss: 69Mb L: 17/31 MS: 1 ChangeBinInt- 00:07:52.235 [2024-11-28 07:33:02.930294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e3630a63 cdw11:44630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.930320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #87 NEW cov: 11844 ft: 15131 corp: 38/567b lim: 45 exec/s: 87 rss: 69Mb L: 13/31 MS: 1 ChangeByte- 00:07:52.235 [2024-11-28 07:33:02.970433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52630a63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.235 [2024-11-28 07:33:02.970458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.235 #88 NEW cov: 11844 ft: 15167 corp: 39/584b lim: 45 exec/s: 44 rss: 69Mb L: 17/31 MS: 1 ShuffleBytes- 00:07:52.235 #88 DONE cov: 11844 ft: 15167 corp: 39/584b lim: 45 exec/s: 44 rss: 69Mb 00:07:52.235 ###### Recommended dictionary. ###### 00:07:52.235 "\377\377\377\365" # Uses: 3 00:07:52.235 "G\021\347m\343\373\222\000" # Uses: 0 00:07:52.235 ###### End of recommended dictionary. ###### 00:07:52.235 Done 88 runs in 2 second(s) 00:07:52.494 07:33:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:52.494 07:33:03 -- ../common.sh@72 -- # (( i++ )) 00:07:52.494 07:33:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.494 07:33:03 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:52.494 07:33:03 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:52.494 07:33:03 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.494 07:33:03 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.494 07:33:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.494 07:33:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:52.494 07:33:03 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:52.494 07:33:03 -- nvmf/run.sh@29 -- # port=4406 00:07:52.494 07:33:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.494 07:33:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:52.494 07:33:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.494 07:33:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:52.494 [2024-11-28 07:33:03.154410] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:52.494 [2024-11-28 07:33:03.154494] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1660659 ] 00:07:52.494 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.752 [2024-11-28 07:33:03.334372] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.752 [2024-11-28 07:33:03.354166] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.752 [2024-11-28 07:33:03.354283] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.752 [2024-11-28 07:33:03.405780] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.752 [2024-11-28 07:33:03.422139] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:52.752 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.752 INFO: Seed: 1715681497 00:07:52.752 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:52.752 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:52.752 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.752 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.752 #2 INITED exec/s: 0 rss: 60Mb 00:07:52.752 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.752 This may also happen if the target rejected all inputs we tried so far 00:07:52.752 [2024-11-28 07:33:03.481038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:52.752 [2024-11-28 07:33:03.481067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.010 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:53.010 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.010 #4 NEW cov: 11531 ft: 11532 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ShuffleBytes-CopyPart- 00:07:53.266 [2024-11-28 07:33:03.792476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.792534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.792626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.792655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.792731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.792757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.792831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.792859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.792933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.792960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.266 #5 NEW cov: 11646 ft: 12461 corp: 3/13b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:53.266 [2024-11-28 07:33:03.841788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.841814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #7 NEW cov: 11652 ft: 12801 corp: 4/15b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 2 CrossOver-CopyPart- 00:07:53.266 [2024-11-28 07:33:03.871933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.871958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 #8 NEW cov: 11737 ft: 13051 corp: 5/17b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:53.266 [2024-11-28 07:33:03.912130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.912161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.912209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.912222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.266 #9 NEW cov: 11737 ft: 13370 corp: 6/21b lim: 10 exec/s: 0 rss: 67Mb L: 4/10 MS: 1 CrossOver- 00:07:53.266 [2024-11-28 07:33:03.952378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae8 cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.952403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.952455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e8e8 cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.952469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.266 [2024-11-28 07:33:03.952522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:53.266 [2024-11-28 07:33:03.952536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.267 #10 NEW cov: 11737 ft: 13546 corp: 7/27b lim: 10 exec/s: 0 rss: 67Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:07:53.267 [2024-11-28 07:33:03.992231] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:53.267 [2024-11-28 07:33:03.992257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.267 #11 NEW cov: 11737 ft: 13602 corp: 8/29b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:53.267 [2024-11-28 07:33:04.032396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:53.267 [2024-11-28 07:33:04.032421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.524 #13 NEW cov: 11737 ft: 13646 corp: 9/31b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 2 CrossOver-InsertByte- 00:07:53.525 [2024-11-28 07:33:04.062535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.062560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.062613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.062626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.525 #14 NEW cov: 11737 ft: 13724 corp: 10/36b lim: 10 exec/s: 0 rss: 68Mb L: 5/10 MS: 1 EraseBytes- 00:07:53.525 [2024-11-28 07:33:04.102908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.102933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.102983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.102996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.103045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.103058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.103115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005660 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.103128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.525 #15 NEW cov: 11737 ft: 13750 corp: 11/45b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CMP- DE: "\377\221\373\337iV`4"- 00:07:53.525 [2024-11-28 07:33:04.132786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.132812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.132863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 
cdw10:0000ff0a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.132876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.525 #16 NEW cov: 11737 ft: 13810 corp: 12/49b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:07:53.525 [2024-11-28 07:33:04.172781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.172807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 #17 NEW cov: 11737 ft: 13832 corp: 13/51b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:53.525 [2024-11-28 07:33:04.212899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.212925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 #18 NEW cov: 11737 ft: 13857 corp: 14/53b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:53.525 [2024-11-28 07:33:04.253440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae8 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.253466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.253519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e8e3 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.253533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.253582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e3e3 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.253596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.253652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000e3e8 cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.253665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.253718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.253731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.525 #19 NEW cov: 11737 ft: 13964 corp: 15/63b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:53.525 [2024-11-28 07:33:04.293247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.293275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.525 [2024-11-28 07:33:04.293324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.525 [2024-11-28 07:33:04.293340] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.783 #20 NEW cov: 11737 ft: 13975 corp: 16/67b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:53.783 [2024-11-28 07:33:04.333587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000008ff cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.333617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.333669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.333683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.333738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.333752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.333804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005660 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.333818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.783 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.783 #21 NEW cov: 11760 ft: 14011 corp: 17/76b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 ChangeBit- 00:07:53.783 [2024-11-28 07:33:04.373557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000008df cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.373584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.373637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006956 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.373651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.373701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006034 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.373715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.783 #22 NEW cov: 11760 ft: 14039 corp: 18/82b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 EraseBytes- 00:07:53.783 [2024-11-28 07:33:04.413920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.413946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.413998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.414012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:53.783 [2024-11-28 07:33:04.414060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.414073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.414125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005660 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.414138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.414186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000034ff cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.414203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.783 #23 NEW cov: 11760 ft: 14083 corp: 19/92b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\221\373\337iV`4"- 00:07:53.783 [2024-11-28 07:33:04.454002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.454027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.454077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.454090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.454139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.454152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.454202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005660 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.454215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.454265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000340a cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.454279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.783 #24 NEW cov: 11760 ft: 14127 corp: 20/102b lim: 10 exec/s: 24 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\221\373\337iV`4"- 00:07:53.783 [2024-11-28 07:33:04.493929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.493954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.494005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e8e8 cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.494019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.783 [2024-11-28 07:33:04.494069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.494084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.783 #25 NEW cov: 11760 ft: 14158 corp: 21/108b lim: 10 exec/s: 25 rss: 68Mb L: 6/10 MS: 1 ChangeByte- 00:07:53.783 [2024-11-28 07:33:04.533772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:53.783 [2024-11-28 07:33:04.533798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.783 #26 NEW cov: 11760 ft: 14278 corp: 22/110b lim: 10 exec/s: 26 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:54.042 [2024-11-28 07:33:04.563967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.563993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.564046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.564059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.042 #27 NEW cov: 11760 ft: 14333 corp: 23/114b lim: 10 exec/s: 27 rss: 69Mb L: 4/10 MS: 1 EraseBytes- 00:07:54.042 [2024-11-28 07:33:04.604356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff91 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.604382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.604435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fbdf cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.604449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.604498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006956 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.604512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.604552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006034 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.604565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.042 #33 NEW cov: 11760 ft: 14363 corp: 24/123b lim: 10 exec/s: 33 rss: 69Mb L: 9/10 MS: 1 PersAutoDict- DE: "\377\221\373\337iV`4"- 00:07:54.042 [2024-11-28 07:33:04.644569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.644595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 
[2024-11-28 07:33:04.644656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.644670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.644709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.644723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.644773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005640 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.644786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.644837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000034ff cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.644851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.042 #34 NEW cov: 11760 ft: 14372 corp: 25/133b lim: 10 exec/s: 34 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:54.042 [2024-11-28 07:33:04.684377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.684403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.684456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.684470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.042 #35 NEW cov: 11760 ft: 14388 corp: 26/138b lim: 10 exec/s: 35 rss: 69Mb L: 5/10 MS: 1 InsertByte- 00:07:54.042 [2024-11-28 07:33:04.724376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fb02 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.724402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 #37 NEW cov: 11760 ft: 14404 corp: 27/140b lim: 10 exec/s: 37 rss: 69Mb L: 2/10 MS: 2 EraseBytes-CrossOver- 00:07:54.042 [2024-11-28 07:33:04.764481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.764506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 #38 NEW cov: 11760 ft: 14419 corp: 28/142b lim: 10 exec/s: 38 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:54.042 [2024-11-28 07:33:04.794681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.794705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.042 [2024-11-28 07:33:04.794756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.042 [2024-11-28 07:33:04.794769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 #39 NEW cov: 11760 ft: 14428 corp: 29/146b lim: 10 exec/s: 39 rss: 69Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:54.301 [2024-11-28 07:33:04.834966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.834991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.835043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.835056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.835106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.835119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.301 #40 NEW cov: 11760 ft: 14435 corp: 30/153b lim: 10 exec/s: 40 rss: 69Mb L: 7/10 MS: 1 EraseBytes- 00:07:54.301 [2024-11-28 07:33:04.875171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.875197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.875245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009141 cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.875258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.875306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.875319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.875369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005660 cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.875383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.301 #41 NEW cov: 11760 ft: 14459 corp: 31/162b lim: 10 exec/s: 41 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:07:54.301 [2024-11-28 07:33:04.915376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.915402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.915452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008d8d cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.915471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:54.301 [2024-11-28 07:33:04.915521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008d8d cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.915535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.915584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008d8d cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.915602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.915654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.915668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.301 #42 NEW cov: 11760 ft: 14498 corp: 32/172b lim: 10 exec/s: 42 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:54.301 [2024-11-28 07:33:04.955212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000490a cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.955238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.955287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.955301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 #43 NEW cov: 11760 ft: 14516 corp: 33/177b lim: 10 exec/s: 43 rss: 69Mb L: 5/10 MS: 1 InsertByte- 00:07:54.301 [2024-11-28 07:33:04.995299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.995324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:04.995377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.301 [2024-11-28 07:33:04.995390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 #44 NEW cov: 11760 ft: 14529 corp: 34/182b lim: 10 exec/s: 44 rss: 69Mb L: 5/10 MS: 1 CopyPart- 00:07:54.301 [2024-11-28 07:33:05.035430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.301 [2024-11-28 07:33:05.035456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.301 [2024-11-28 07:33:05.035508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.301 [2024-11-28 07:33:05.035522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.301 #45 NEW cov: 11760 ft: 14542 corp: 35/187b lim: 10 exec/s: 45 rss: 69Mb L: 5/10 MS: 1 ShuffleBytes- 00:07:54.558 [2024-11-28 07:33:05.075795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.075821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.075870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.075884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.075931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df56 cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.075947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.075996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006034 cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.076009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.558 #46 NEW cov: 11760 ft: 14552 corp: 36/195b lim: 10 exec/s: 46 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:54.558 [2024-11-28 07:33:05.115646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.115671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.115720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.115733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.558 #47 NEW cov: 11760 ft: 14558 corp: 37/199b lim: 10 exec/s: 47 rss: 69Mb L: 4/10 MS: 1 ChangeByte- 00:07:54.558 [2024-11-28 07:33:05.145656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.145681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 #48 NEW cov: 11760 ft: 14577 corp: 38/201b lim: 10 exec/s: 48 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:54.558 [2024-11-28 07:33:05.176068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.176093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.176143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.176157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.176205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000008ff cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.176219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:54.558 [2024-11-28 07:33:05.176269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.176283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.558 #49 NEW cov: 11760 ft: 14584 corp: 39/210b lim: 10 exec/s: 49 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:54.558 [2024-11-28 07:33:05.215875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.215900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 #50 NEW cov: 11760 ft: 14627 corp: 40/212b lim: 10 exec/s: 50 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:54.558 [2024-11-28 07:33:05.246142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.246167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.246218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e8e8 cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.246231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.246278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.246295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.558 #51 NEW cov: 11760 ft: 14647 corp: 41/218b lim: 10 exec/s: 51 rss: 69Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:54.558 [2024-11-28 07:33:05.286175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ad6 cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.286201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.558 [2024-11-28 07:33:05.286252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.286266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.558 #52 NEW cov: 11760 ft: 14653 corp: 42/223b lim: 10 exec/s: 52 rss: 69Mb L: 5/10 MS: 1 ChangeByte- 00:07:54.558 [2024-11-28 07:33:05.326195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.558 [2024-11-28 07:33:05.326222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.816 #53 NEW cov: 11760 ft: 14655 corp: 43/225b lim: 10 exec/s: 53 rss: 69Mb L: 2/10 MS: 1 InsertByte- 00:07:54.816 [2024-11-28 07:33:05.356734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007eff cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.356759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.356808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000091fb cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.356821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.356867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df69 cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.356881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.356929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005640 cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.356941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.356990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000034ff cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.357004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.816 #54 NEW cov: 11760 ft: 14685 corp: 44/235b lim: 10 exec/s: 54 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:54.816 [2024-11-28 07:33:05.396531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.396557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.396615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007e00 cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.396630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.436641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.436667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.816 [2024-11-28 07:33:05.436719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007e00 cdw11:00000000 00:07:54.816 [2024-11-28 07:33:05.436736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.817 #56 NEW cov: 11760 ft: 14696 corp: 45/239b lim: 10 exec/s: 56 rss: 69Mb L: 4/10 MS: 2 CMP-ShuffleBytes- DE: "~\000"- 00:07:54.817 [2024-11-28 07:33:05.477062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:54.817 [2024-11-28 07:33:05.477088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.817 [2024-11-28 07:33:05.477139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:07:54.817 [2024-11-28 07:33:05.477151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.817 [2024-11-28 07:33:05.477200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000008ff cdw11:00000000 00:07:54.817 [2024-11-28 07:33:05.477213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.817 [2024-11-28 07:33:05.477263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.817 [2024-11-28 07:33:05.477276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.817 [2024-11-28 07:33:05.477325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:54.817 [2024-11-28 07:33:05.477338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.817 #57 NEW cov: 11760 ft: 14698 corp: 46/249b lim: 10 exec/s: 28 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:54.817 #57 DONE cov: 11760 ft: 14698 corp: 46/249b lim: 10 exec/s: 28 rss: 70Mb 00:07:54.817 ###### Recommended dictionary. ###### 00:07:54.817 "\377\221\373\337iV`4" # Uses: 3 00:07:54.817 "~\000" # Uses: 0 00:07:54.817 ###### End of recommended dictionary. ###### 00:07:54.817 Done 57 runs in 2 second(s) 00:07:55.075 07:33:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:55.075 07:33:05 -- ../common.sh@72 -- # (( i++ )) 00:07:55.075 07:33:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.075 07:33:05 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:55.075 07:33:05 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:55.075 07:33:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.075 07:33:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.075 07:33:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:55.075 07:33:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:55.075 07:33:05 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:55.075 07:33:05 -- nvmf/run.sh@29 -- # port=4407 00:07:55.075 07:33:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:55.075 07:33:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:55.075 07:33:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.075 07:33:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:55.075 [2024-11-28 07:33:05.658483] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
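[Editor's note] The closing report above ("#57 DONE cov: ... Done 57 runs in 2 second(s)") follows libFuzzer's standard status-line format; the annotation below unpacks run 6's final line. Field meanings are taken from stock libFuzzer's status output, not from anything stated in this log, and the reading of the "L:" field is my interpretation.

    # #57 DONE cov: 11760 ft: 14698 corp: 46/249b lim: 10 exec/s: 28 rss: 70Mb
    #
    # #57      event counter: the 57th unit the fuzzer reported on
    # DONE     run finished ("NEW" instead marks an input that added coverage
    #          and was therefore kept in the corpus)
    # cov:     distinct coverage counters (edges) observed so far
    # ft:      "features" -- a finer-grained coverage signal than cov
    # corp:    corpus state: 46 inputs totalling 249 bytes
    # lim:     current cap on generated input length (10 bytes here)
    # exec/s:  executions per second over the run
    # rss:     resident memory of the fuzzer process
    #
    # On "NEW" lines, "L: x/y" gives the new input's length against the corpus
    # maximum (as I read it), and "MS: n Op1-Op2-" is the mutation sequence
    # that produced it, e.g. "MS: 1 CopyPart-" above.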
00:07:55.075 [2024-11-28 07:33:05.658553] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661123 ] 00:07:55.075 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.075 [2024-11-28 07:33:05.830699] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.333 [2024-11-28 07:33:05.850481] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.333 [2024-11-28 07:33:05.850607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.333 [2024-11-28 07:33:05.901909] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.333 [2024-11-28 07:33:05.918257] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:55.333 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.333 INFO: Seed: 4211669097 00:07:55.333 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:55.333 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:55.333 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:55.333 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.333 #2 INITED exec/s: 0 rss: 59Mb 00:07:55.333 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.333 This may also happen if the target rejected all inputs we tried so far 00:07:55.333 [2024-11-28 07:33:05.967104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.333 [2024-11-28 07:33:05.967132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.333 [2024-11-28 07:33:05.967184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.333 [2024-11-28 07:33:05.967198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.591 NEW_FUNC[1/666]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:55.591 NEW_FUNC[2/666]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.591 #3 NEW cov: 11504 ft: 11504 corp: 2/5b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:55.591 [2024-11-28 07:33:06.268012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.591 [2024-11-28 07:33:06.268043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.591 [2024-11-28 07:33:06.268092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.591 [2024-11-28 07:33:06.268106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.592 [2024-11-28 07:33:06.268156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) 
qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.268169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.592 [2024-11-28 07:33:06.268216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.268230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.592 NEW_FUNC[1/3]: 0x1537a68 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:07:55.592 NEW_FUNC[2/3]: 0x1707538 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:07:55.592 #6 NEW cov: 11646 ft: 12310 corp: 3/14b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:07:55.592 [2024-11-28 07:33:06.307722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6e cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.307749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.592 #8 NEW cov: 11652 ft: 12864 corp: 4/16b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 2 ShuffleBytes-InsertByte- 00:07:55.592 [2024-11-28 07:33:06.348219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.348246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.592 [2024-11-28 07:33:06.348296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.348310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.592 [2024-11-28 07:33:06.348360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.348373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.592 [2024-11-28 07:33:06.348422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.592 [2024-11-28 07:33:06.348434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.850 #9 NEW cov: 11737 ft: 13138 corp: 5/25b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeByte- 00:07:55.850 [2024-11-28 07:33:06.388293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.850 [2024-11-28 07:33:06.388319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.850 [2024-11-28 07:33:06.388367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:55.850 [2024-11-28 07:33:06.388380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.850 [2024-11-28 
07:33:06.388426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:55.850 [2024-11-28 07:33:06.388439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.850 [2024-11-28 07:33:06.388487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.850 [2024-11-28 07:33:06.388500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.850 #10 NEW cov: 11737 ft: 13190 corp: 6/34b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:55.850 [2024-11-28 07:33:06.428071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000026e cdw11:00000000 00:07:55.850 [2024-11-28 07:33:06.428096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.850 #11 NEW cov: 11737 ft: 13261 corp: 7/36b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:55.851 [2024-11-28 07:33:06.468312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.468337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.851 [2024-11-28 07:33:06.468387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.468401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.851 #12 NEW cov: 11737 ft: 13327 corp: 8/40b lim: 10 exec/s: 0 rss: 67Mb L: 4/9 MS: 1 ShuffleBytes- 00:07:55.851 [2024-11-28 07:33:06.508420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.508449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.851 [2024-11-28 07:33:06.508500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.508513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.851 #13 NEW cov: 11737 ft: 13353 corp: 9/44b lim: 10 exec/s: 0 rss: 67Mb L: 4/9 MS: 1 CopyPart- 00:07:55.851 [2024-11-28 07:33:06.548428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.548454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.851 #14 NEW cov: 11737 ft: 13397 corp: 10/46b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 1 CopyPart- 00:07:55.851 [2024-11-28 07:33:06.578488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a08 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.578514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.851 #15 NEW cov: 11737 ft: 13427 
corp: 11/48b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 1 ChangeBit- 00:07:55.851 [2024-11-28 07:33:06.618697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:07:55.851 [2024-11-28 07:33:06.618723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.109 #16 NEW cov: 11737 ft: 13446 corp: 12/50b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 1 InsertByte- 00:07:56.109 [2024-11-28 07:33:06.648792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.648817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.109 [2024-11-28 07:33:06.648866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.648879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.109 #17 NEW cov: 11737 ft: 13465 corp: 13/54b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 CopyPart- 00:07:56.109 [2024-11-28 07:33:06.689158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.689183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.109 [2024-11-28 07:33:06.689236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002100 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.689249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.109 [2024-11-28 07:33:06.689300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.689313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.109 [2024-11-28 07:33:06.689364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.689377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.109 #18 NEW cov: 11737 ft: 13481 corp: 14/63b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeByte- 00:07:56.109 [2024-11-28 07:33:06.729153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.109 [2024-11-28 07:33:06.729179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.109 [2024-11-28 07:33:06.729235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.729248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.110 [2024-11-28 07:33:06.729297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 
00:07:56.110 [2024-11-28 07:33:06.729310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.110 #19 NEW cov: 11737 ft: 13639 corp: 15/69b lim: 10 exec/s: 0 rss: 68Mb L: 6/9 MS: 1 CrossOver- 00:07:56.110 [2024-11-28 07:33:06.769143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.769168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.110 [2024-11-28 07:33:06.769220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000002d cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.769232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.110 #20 NEW cov: 11737 ft: 13663 corp: 16/73b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeByte- 00:07:56.110 [2024-11-28 07:33:06.809243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.809268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.110 [2024-11-28 07:33:06.809319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000f7 cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.809332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.110 #21 NEW cov: 11737 ft: 13709 corp: 17/77b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeBinInt- 00:07:56.110 [2024-11-28 07:33:06.839369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a7e cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.839395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.110 [2024-11-28 07:33:06.839446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.110 [2024-11-28 07:33:06.839460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.110 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.110 #22 NEW cov: 11760 ft: 13737 corp: 18/82b lim: 10 exec/s: 0 rss: 68Mb L: 5/9 MS: 1 InsertByte- 00:07:56.368 [2024-11-28 07:33:06.879519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.879546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.368 [2024-11-28 07:33:06.879595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.879614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.368 #23 NEW cov: 11760 ft: 13755 corp: 19/86b lim: 10 exec/s: 0 rss: 68Mb L: 4/9 MS: 1 ChangeByte- 00:07:56.368 [2024-11-28 07:33:06.909820] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.909846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.368 [2024-11-28 07:33:06.909896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.909911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.368 [2024-11-28 07:33:06.909959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.909972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.368 [2024-11-28 07:33:06.910021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.910033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.368 #24 NEW cov: 11760 ft: 13803 corp: 20/95b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CopyPart- 00:07:56.368 [2024-11-28 07:33:06.949613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af9 cdw11:00000000 00:07:56.368 [2024-11-28 07:33:06.949639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.369 #25 NEW cov: 11760 ft: 13826 corp: 21/97b lim: 10 exec/s: 25 rss: 68Mb L: 2/9 MS: 1 ChangeBit- 00:07:56.369 [2024-11-28 07:33:06.989743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:06.989769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.369 #26 NEW cov: 11760 ft: 13887 corp: 22/99b lim: 10 exec/s: 26 rss: 68Mb L: 2/9 MS: 1 EraseBytes- 00:07:56.369 [2024-11-28 07:33:07.029939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.029964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.369 [2024-11-28 07:33:07.030014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002d2d cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.030027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.369 #27 NEW cov: 11760 ft: 13915 corp: 23/103b lim: 10 exec/s: 27 rss: 68Mb L: 4/9 MS: 1 CopyPart- 00:07:56.369 [2024-11-28 07:33:07.070065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.070090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.369 [2024-11-28 07:33:07.070140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.369 
[2024-11-28 07:33:07.070154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.369 #28 NEW cov: 11760 ft: 13922 corp: 24/108b lim: 10 exec/s: 28 rss: 68Mb L: 5/9 MS: 1 InsertByte- 00:07:56.369 [2024-11-28 07:33:07.110405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ad9 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.110430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.369 [2024-11-28 07:33:07.110483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.110496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.369 [2024-11-28 07:33:07.110548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.110561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.369 [2024-11-28 07:33:07.110611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.369 [2024-11-28 07:33:07.110624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.369 #29 NEW cov: 11760 ft: 13977 corp: 25/116b lim: 10 exec/s: 29 rss: 68Mb L: 8/9 MS: 1 CrossOver- 00:07:56.627 [2024-11-28 07:33:07.150260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.627 [2024-11-28 07:33:07.150287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.627 [2024-11-28 07:33:07.150340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e500 cdw11:00000000 00:07:56.627 [2024-11-28 07:33:07.150354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.627 #30 NEW cov: 11760 ft: 14020 corp: 26/120b lim: 10 exec/s: 30 rss: 68Mb L: 4/9 MS: 1 ChangeByte- 00:07:56.628 [2024-11-28 07:33:07.180465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.180491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.180544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.180558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.180610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.180624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.628 #31 NEW cov: 11760 ft: 14042 corp: 27/126b lim: 10 exec/s: 31 
rss: 68Mb L: 6/9 MS: 1 CopyPart- 00:07:56.628 [2024-11-28 07:33:07.220707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.220733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.220785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.220798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.220847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.220861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.220908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.220921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.628 #32 NEW cov: 11760 ft: 14059 corp: 28/135b lim: 10 exec/s: 32 rss: 68Mb L: 9/9 MS: 1 CopyPart- 00:07:56.628 [2024-11-28 07:33:07.260909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.260934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.260984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.260997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.261048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.261062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.261111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.261124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.261173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff2d cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.261186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.628 #33 NEW cov: 11760 ft: 14099 corp: 29/145b lim: 10 exec/s: 33 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:56.628 [2024-11-28 07:33:07.300690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.300716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.300766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006e00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.300779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.628 #34 NEW cov: 11760 ft: 14103 corp: 30/149b lim: 10 exec/s: 34 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:07:56.628 [2024-11-28 07:33:07.340810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.340835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.340884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002d25 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.340897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.628 #35 NEW cov: 11760 ft: 14109 corp: 31/153b lim: 10 exec/s: 35 rss: 68Mb L: 4/10 MS: 1 ChangeByte- 00:07:56.628 [2024-11-28 07:33:07.380938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.380965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.628 [2024-11-28 07:33:07.381013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.628 [2024-11-28 07:33:07.381027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 #36 NEW cov: 11760 ft: 14122 corp: 32/157b lim: 10 exec/s: 36 rss: 68Mb L: 4/10 MS: 1 ChangeByte- 00:07:56.887 [2024-11-28 07:33:07.421022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.421049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.421099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000034 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.421113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 #37 NEW cov: 11760 ft: 14199 corp: 33/162b lim: 10 exec/s: 37 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:56.887 [2024-11-28 07:33:07.461367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.461396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.461450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.461463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.461515] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.461529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.461579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.461592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.887 #38 NEW cov: 11760 ft: 14214 corp: 34/171b lim: 10 exec/s: 38 rss: 68Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:56.887 [2024-11-28 07:33:07.501478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.501504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.501552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.501565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.501616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.501631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.501680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.501694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.887 #39 NEW cov: 11760 ft: 14227 corp: 35/180b lim: 10 exec/s: 39 rss: 68Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:56.887 [2024-11-28 07:33:07.541387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.541412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.541464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.541478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 #40 NEW cov: 11760 ft: 14229 corp: 36/184b lim: 10 exec/s: 40 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:07:56.887 [2024-11-28 07:33:07.581853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.581878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.581927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.581940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.581990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.582003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.582054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.582067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.582117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.582131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.887 #41 NEW cov: 11760 ft: 14253 corp: 37/194b lim: 10 exec/s: 41 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:56.887 [2024-11-28 07:33:07.621608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.621633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.621684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.621697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 #42 NEW cov: 11760 ft: 14279 corp: 38/198b lim: 10 exec/s: 42 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:07:56.887 [2024-11-28 07:33:07.651950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.651975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.652027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005454 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.652040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.652091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005400 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.652104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.887 [2024-11-28 07:33:07.652152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 00:07:56.887 [2024-11-28 07:33:07.652165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.147 #43 NEW cov: 11760 ft: 14292 corp: 39/207b lim: 10 exec/s: 43 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:57.147 [2024-11-28 07:33:07.692203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 
cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.692229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.692279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.692292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.692342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.692355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.692402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005f cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.692415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.692467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.692480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.147 #44 NEW cov: 11760 ft: 14302 corp: 40/217b lim: 10 exec/s: 44 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:57.147 [2024-11-28 07:33:07.732188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a00 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.732215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.732265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.732279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.732322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.732335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.732385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.732399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.147 #45 NEW cov: 11760 ft: 14315 corp: 41/226b lim: 10 exec/s: 45 rss: 69Mb L: 9/10 MS: 1 ChangeBit- 00:07:57.147 [2024-11-28 07:33:07.762022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.762048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.762099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002b00 
cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.762113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.147 #46 NEW cov: 11760 ft: 14351 corp: 42/230b lim: 10 exec/s: 46 rss: 69Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:57.147 [2024-11-28 07:33:07.802251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.802277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.802328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.802341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.802393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.802406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.147 #47 NEW cov: 11760 ft: 14354 corp: 43/237b lim: 10 exec/s: 47 rss: 69Mb L: 7/10 MS: 1 CopyPart- 00:07:57.147 [2024-11-28 07:33:07.842595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.842624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.842677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.842698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.842752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.842766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.147 [2024-11-28 07:33:07.842817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.147 [2024-11-28 07:33:07.842829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.148 [2024-11-28 07:33:07.842880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000029 cdw11:00000000 00:07:57.148 [2024-11-28 07:33:07.842893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.148 #48 NEW cov: 11760 ft: 14380 corp: 44/247b lim: 10 exec/s: 48 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:57.148 [2024-11-28 07:33:07.882591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000026e cdw11:00000000 00:07:57.148 [2024-11-28 07:33:07.882621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.148 [2024-11-28 07:33:07.882674] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000505 cdw11:00000000 00:07:57.148 [2024-11-28 07:33:07.882687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.148 [2024-11-28 07:33:07.882742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000505 cdw11:00000000 00:07:57.148 [2024-11-28 07:33:07.882755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.148 [2024-11-28 07:33:07.882808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000505 cdw11:00000000 00:07:57.148 [2024-11-28 07:33:07.882822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.148 #49 NEW cov: 11760 ft: 14388 corp: 45/255b lim: 10 exec/s: 49 rss: 69Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:57.406 [2024-11-28 07:33:07.922474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.406 [2024-11-28 07:33:07.922499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.406 [2024-11-28 07:33:07.922550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.406 [2024-11-28 07:33:07.922563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.406 #50 NEW cov: 11760 ft: 14451 corp: 46/259b lim: 10 exec/s: 50 rss: 69Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:57.406 [2024-11-28 07:33:07.962578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:07:57.406 [2024-11-28 07:33:07.962609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.406 [2024-11-28 07:33:07.962660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000740a cdw11:00000000 00:07:57.406 [2024-11-28 07:33:07.962673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.406 #51 NEW cov: 11760 ft: 14452 corp: 47/264b lim: 10 exec/s: 25 rss: 69Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:57.406 #51 DONE cov: 11760 ft: 14452 corp: 47/264b lim: 10 exec/s: 25 rss: 69Mb 00:07:57.406 Done 51 runs in 2 second(s) 00:07:57.406 07:33:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:57.406 07:33:08 -- ../common.sh@72 -- # (( i++ )) 00:07:57.406 07:33:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.406 07:33:08 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:57.406 07:33:08 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:57.406 07:33:08 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.406 07:33:08 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.406 07:33:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:57.406 07:33:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:57.406 07:33:08 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:57.406 07:33:08 -- nvmf/run.sh@29 -- # port=4408 
00:07:57.406 07:33:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:57.406 07:33:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:57.406 07:33:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.406 07:33:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:57.406 [2024-11-28 07:33:08.136094] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:57.406 [2024-11-28 07:33:08.136165] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661183 ] 00:07:57.406 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.665 [2024-11-28 07:33:08.311535] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.665 [2024-11-28 07:33:08.330773] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.665 [2024-11-28 07:33:08.330887] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.665 [2024-11-28 07:33:08.382071] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.665 [2024-11-28 07:33:08.398433] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:57.665 INFO: Running with entropic power schedule (0xFF, 100). 
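[Editor's note] The 07:33:08 xtrace above captures one full iteration of run.sh's launch loop: derive a per-fuzzer TCP port (44 plus the zero-padded fuzzer index), create a private corpus directory, rewrite the template JSON config's trsvcid with sed, then invoke llvm_nvme_fuzz against that listener. Below is a minimal bash sketch reconstructed from the trace alone; SPDK_DIR, the variable names, the output redirection on sed (xtrace does not show redirections), and the placement of the cleanup are assumptions, not lifted from run.sh.

    #!/usr/bin/env bash
    # One iteration of the fuzzer launch loop, as inferred from the trace.
    fuzzer_type=8
    timen=1                                 # -t: seconds per run
    core=0x1                                # -m: reactor core mask
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk  # assumption

    port=44$(printf %02d "$fuzzer_type")    # 4408 for fuzzer 8
    corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type
    nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    mkdir -p "$corpus_dir"
    # Rebase the template config from the default port 4420 onto this run's
    # port; redirection into $nvmf_cfg is assumed.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" \
        -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"

    rm -rf "$nvmf_cfg"                      # cleanup before the next iteration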
00:07:57.665 INFO: Seed: 2396707427 00:07:57.665 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:57.665 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:57.665 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:57.665 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.923 [2024-11-28 07:33:08.446960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.923 [2024-11-28 07:33:08.446990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.923 #2 INITED cov: 11560 ft: 11561 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:57.923 [2024-11-28 07:33:08.477345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.923 [2024-11-28 07:33:08.477371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.923 [2024-11-28 07:33:08.477424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.923 [2024-11-28 07:33:08.477438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.923 [2024-11-28 07:33:08.477487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.923 [2024-11-28 07:33:08.477501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.923 [2024-11-28 07:33:08.477554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.923 [2024-11-28 07:33:08.477567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.182 NEW_FUNC[1/1]: 0x17a2ed8 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:412 00:07:58.182 #3 NEW cov: 11674 ft: 12648 corp: 2/5b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:58.182 [2024-11-28 07:33:08.777826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.777858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.182 #4 NEW cov: 11680 ft: 12975 corp: 3/6b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:07:58.182 [2024-11-28 07:33:08.817889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.817917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.182 #5 NEW cov: 11765 ft: 13351 corp: 4/7b lim: 5 
exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:58.182 [2024-11-28 07:33:08.858503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.858533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.858593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.858613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.858670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.858684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.858744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.858757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.182 #6 NEW cov: 11765 ft: 13447 corp: 5/11b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBit- 00:07:58.182 [2024-11-28 07:33:08.898771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.898797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.898856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.898870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.898924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.898938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.898997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.899011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.899066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.899080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:07:58.182 #7 NEW cov: 11765 ft: 13599 corp: 6/16b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CrossOver- 00:07:58.182 [2024-11-28 07:33:08.938654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.938679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.938735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.938750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.938801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.938816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.182 [2024-11-28 07:33:08.938866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.182 [2024-11-28 07:33:08.938880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.441 #8 NEW cov: 11765 ft: 13647 corp: 7/20b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 EraseBytes- 00:07:58.441 [2024-11-28 07:33:08.978330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:08.978356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 #9 NEW cov: 11765 ft: 13723 corp: 8/21b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 CopyPart- 00:07:58.441 [2024-11-28 07:33:09.018906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.018932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.019004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.019018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.019072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.019086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.019119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:58.441 [2024-11-28 07:33:09.019132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.441 #10 NEW cov: 11765 ft: 13746 corp: 9/25b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:58.441 [2024-11-28 07:33:09.058567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.058592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 #11 NEW cov: 11765 ft: 13839 corp: 10/26b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:07:58.441 [2024-11-28 07:33:09.098707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.098732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 #12 NEW cov: 11765 ft: 13884 corp: 11/27b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:07:58.441 [2024-11-28 07:33:09.138813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.138838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 #13 NEW cov: 11765 ft: 13990 corp: 12/28b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:58.441 [2024-11-28 07:33:09.179375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.179400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.179456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.179470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.179526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.179539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.441 [2024-11-28 07:33:09.179594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.441 [2024-11-28 07:33:09.179612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.441 #14 NEW cov: 11765 ft: 14080 corp: 13/32b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CopyPart- 00:07:58.699 [2024-11-28 07:33:09.219191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:58.699 [2024-11-28 07:33:09.219216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.699 [2024-11-28 07:33:09.219289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.699 [2024-11-28 07:33:09.219304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.699 #15 NEW cov: 11765 ft: 14333 corp: 14/34b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:07:58.699 [2024-11-28 07:33:09.259135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.699 [2024-11-28 07:33:09.259160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.699 #16 NEW cov: 11765 ft: 14414 corp: 15/35b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:58.699 [2024-11-28 07:33:09.299456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.699 [2024-11-28 07:33:09.299481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.299537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.299551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.700 #17 NEW cov: 11765 ft: 14469 corp: 16/37b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:58.700 [2024-11-28 07:33:09.340021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.340045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.340102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.340116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.340170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.340185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.340239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.340253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 
07:33:09.340307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.340321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.700 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.700 #18 NEW cov: 11788 ft: 14531 corp: 17/42b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:07:58.700 [2024-11-28 07:33:09.379677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.379703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.379757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.379772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.700 #19 NEW cov: 11788 ft: 14545 corp: 18/44b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:58.700 [2024-11-28 07:33:09.419814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.419839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.419899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.419913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.700 #20 NEW cov: 11788 ft: 14587 corp: 19/46b lim: 5 exec/s: 20 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:58.700 [2024-11-28 07:33:09.460062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.460088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.460148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.460162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.700 [2024-11-28 07:33:09.460217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.700 [2024-11-28 07:33:09.460231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.959 #21 NEW cov: 11788 ft: 14802 corp: 20/49b lim: 5 exec/s: 21 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 
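[Annotator's note: each "#N NEW" line above is libFuzzer recording that a mutated input reached new coverage and joined the corpus: cov counts instrumented edges reached so far, ft counts coverage features, corp is corpus entries over total bytes, lim is the current input-length cap (5 bytes throughout this run), exec/s and rss are throughput and memory, L gives the new input's length and the largest corpus entry so far, and MS names the mutation sequence that produced it (ChangeByte, EraseBytes, CrossOver, and so on). The occasional NEW_FUNC lines name functions entered for the first time, such as get_rusage above. A hypothetical helper for pulling the counters out of one of these lines, not part of SPDK or libFuzzer, with a format string that mirrors the status lines as printed here:]

    #include <stdio.h>

    int main(void)
    {
        /* The sample is the "#21 NEW" line emitted just above. */
        const char *line =
            "#21 NEW cov: 11788 ft: 14802 corp: 20/49b lim: 5 "
            "exec/s: 21 rss: 68Mb";
        unsigned iter, cov, ft, entries, bytes, lim, execs, rss;

        if (sscanf(line,
                   "#%u NEW cov: %u ft: %u corp: %u/%ub lim: %u "
                   "exec/s: %u rss: %uMb",
                   &iter, &cov, &ft, &entries, &bytes, &lim,
                   &execs, &rss) == 8) {
            printf("input #%u: %u edges, %u features, corpus %u entries "
                   "(%u bytes), %u exec/s\n",
                   iter, cov, ft, entries, bytes, execs);
        }
        return 0;
    }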
00:07:58.959 [2024-11-28 07:33:09.499885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.499910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 #22 NEW cov: 11788 ft: 14823 corp: 21/50b lim: 5 exec/s: 22 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:58.959 [2024-11-28 07:33:09.540135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.540160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 [2024-11-28 07:33:09.540216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.540230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.959 #23 NEW cov: 11788 ft: 14839 corp: 22/52b lim: 5 exec/s: 23 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:58.959 [2024-11-28 07:33:09.580147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.580172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 #24 NEW cov: 11788 ft: 14866 corp: 23/53b lim: 5 exec/s: 24 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:58.959 [2024-11-28 07:33:09.610170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.610195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 #25 NEW cov: 11788 ft: 14908 corp: 24/54b lim: 5 exec/s: 25 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:58.959 [2024-11-28 07:33:09.650283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.650308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 #26 NEW cov: 11788 ft: 14917 corp: 25/55b lim: 5 exec/s: 26 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:58.959 [2024-11-28 07:33:09.690570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.690595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.959 [2024-11-28 07:33:09.690655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.959 [2024-11-28 07:33:09.690669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:58.959 #27 NEW cov: 11788 ft: 14929 corp: 26/57b lim: 5 exec/s: 27 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:59.218 [2024-11-28 07:33:09.730960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.730985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 [2024-11-28 07:33:09.731046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.731059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.218 [2024-11-28 07:33:09.731131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.731145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.218 #28 NEW cov: 11788 ft: 14934 corp: 27/60b lim: 5 exec/s: 28 rss: 68Mb L: 3/5 MS: 1 ChangeBit- 00:07:59.218 [2024-11-28 07:33:09.770855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.770881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 [2024-11-28 07:33:09.770936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.770951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.218 #29 NEW cov: 11788 ft: 14943 corp: 28/62b lim: 5 exec/s: 29 rss: 68Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:59.218 [2024-11-28 07:33:09.810768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.810792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 #30 NEW cov: 11788 ft: 14972 corp: 29/63b lim: 5 exec/s: 30 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:59.218 [2024-11-28 07:33:09.851008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.851033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 [2024-11-28 07:33:09.851091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.851105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.218 #31 NEW cov: 11788 ft: 15023 corp: 30/65b lim: 5 exec/s: 31 rss: 69Mb 
L: 2/5 MS: 1 ShuffleBytes- 00:07:59.218 [2024-11-28 07:33:09.890995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.891020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 #32 NEW cov: 11788 ft: 15047 corp: 31/66b lim: 5 exec/s: 32 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:59.218 [2024-11-28 07:33:09.931096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.931121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.218 #33 NEW cov: 11788 ft: 15073 corp: 32/67b lim: 5 exec/s: 33 rss: 69Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:59.218 [2024-11-28 07:33:09.971223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.218 [2024-11-28 07:33:09.971249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 #34 NEW cov: 11788 ft: 15096 corp: 33/68b lim: 5 exec/s: 34 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:59.478 [2024-11-28 07:33:10.011338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.011364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 #35 NEW cov: 11788 ft: 15164 corp: 34/69b lim: 5 exec/s: 35 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:59.478 [2024-11-28 07:33:10.051521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.051547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 #36 NEW cov: 11788 ft: 15181 corp: 35/70b lim: 5 exec/s: 36 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:07:59.478 [2024-11-28 07:33:10.092089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.092116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.092172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.092187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.092241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.092256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.092307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.092321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.478 #37 NEW cov: 11788 ft: 15187 corp: 36/74b lim: 5 exec/s: 37 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:59.478 [2024-11-28 07:33:10.131697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.131722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 #38 NEW cov: 11788 ft: 15214 corp: 37/75b lim: 5 exec/s: 38 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:59.478 [2024-11-28 07:33:10.172402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.172427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.172481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.172495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.172549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.172563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.172619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.172632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.172686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.172699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.478 #39 NEW cov: 11788 ft: 15224 corp: 38/80b lim: 5 exec/s: 39 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:59.478 [2024-11-28 07:33:10.212555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.212581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.212636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.212652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.212705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.212719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.212771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.212785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.478 [2024-11-28 07:33:10.212839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.478 [2024-11-28 07:33:10.212853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.478 #40 NEW cov: 11788 ft: 15233 corp: 39/85b lim: 5 exec/s: 40 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:59.738 [2024-11-28 07:33:10.252478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.252502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.252579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.252594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.252656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.252671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.252726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.252740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.738 #41 NEW cov: 11788 ft: 15255 corp: 40/89b lim: 5 exec/s: 41 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:59.738 [2024-11-28 07:33:10.292435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.292460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.292518] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.292532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.292588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.292606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.738 #42 NEW cov: 11788 ft: 15267 corp: 41/92b lim: 5 exec/s: 42 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:59.738 [2024-11-28 07:33:10.332439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.332463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.332518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.332532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.738 #43 NEW cov: 11788 ft: 15274 corp: 42/94b lim: 5 exec/s: 43 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:59.738 [2024-11-28 07:33:10.372412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.372436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 #44 NEW cov: 11788 ft: 15280 corp: 43/95b lim: 5 exec/s: 44 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:59.738 [2024-11-28 07:33:10.402942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.402967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.403021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.403038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.403091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.403104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.738 [2024-11-28 07:33:10.403156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.403169] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.738 #45 NEW cov: 11788 ft: 15285 corp: 44/99b lim: 5 exec/s: 45 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:59.738 [2024-11-28 07:33:10.442547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.738 [2024-11-28 07:33:10.442572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.738 #46 NEW cov: 11788 ft: 15297 corp: 45/100b lim: 5 exec/s: 23 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:59.738 #46 DONE cov: 11788 ft: 15297 corp: 45/100b lim: 5 exec/s: 23 rss: 69Mb 00:07:59.738 Done 46 runs in 2 second(s) 00:07:59.998 07:33:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:59.998 07:33:10 -- ../common.sh@72 -- # (( i++ )) 00:07:59.998 07:33:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.998 07:33:10 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:59.998 07:33:10 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:59.998 07:33:10 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.998 07:33:10 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.998 07:33:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:59.998 07:33:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:59.998 07:33:10 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:59.998 07:33:10 -- nvmf/run.sh@29 -- # port=4409 00:07:59.998 07:33:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:59.998 07:33:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:59.998 07:33:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.998 07:33:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:59.998 [2024-11-28 07:33:10.615157] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
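[Annotator's note: run 8 ends just above with "#46 DONE cov: 11788 ft: 15297 corp: 45/100b" and "Done 46 runs in 2 second(s)"; run.sh then removes /tmp/fuzz_json_8.conf and common.sh relaunches the same harness as fuzzer 9 on port 4409. The visible "printf %02d 9" / port=4409 lines suggest each listener port is simply 44 followed by the zero-padded fuzzer index (8 gives 4408, 9 gives 4409). Every mutation report in run 8 paired NAMESPACE ATTACHMENT (opcode 0x15) admin-command prints with INVALID OPCODE (00/01) completions, and run 9 below drives NAMESPACE MANAGEMENT (opcode 0x0d) the same way. A simplified sketch of the fields visible in those traces, using a stand-in struct rather than SPDK's real struct spdk_nvme_cmd from include/spdk/nvme_spec.h:]

    /* Stand-in only; the real command layout is in include/spdk/nvme_spec.h.
     * The traces show the fuzzer flipping cdw10 between values like 0x0,
     * 0x2, and 0x6 while the target rejects every command as INVALID OPCODE
     * (status code type 00, status code 01). */
    #include <stdint.h>
    #include <stdio.h>

    struct fake_nvme_cmd {
        uint8_t  opc;    /* 0x15 NAMESPACE ATTACHMENT, 0x0d NAMESPACE MANAGEMENT */
        uint16_t cid;    /* command id echoed back in the completion (cid:4..8) */
        uint32_t nsid;   /* nsid:0 in every trace above */
        uint32_t cdw10;  /* the dword most mutations land in */
        uint32_t cdw11;
    };

    int main(void)
    {
        struct fake_nvme_cmd cmd = {
            .opc = 0x15, .cid = 4, .nsid = 0, .cdw10 = 0x2, .cdw11 = 0x0,
        };
        printf("opc=%#x cid=%u nsid=%u cdw10=%#x cdw11=%#x\n",
               (unsigned)cmd.opc, (unsigned)cmd.cid, (unsigned)cmd.nsid,
               (unsigned)cmd.cdw10, (unsigned)cmd.cdw11);
        return 0;
    }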
00:07:59.998 [2024-11-28 07:33:10.615229] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661230 ] 00:07:59.998 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.256 [2024-11-28 07:33:10.795800] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.256 [2024-11-28 07:33:10.815535] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.256 [2024-11-28 07:33:10.815655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.256 [2024-11-28 07:33:10.866896] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.256 [2024-11-28 07:33:10.883284] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:00.256 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.256 INFO: Seed: 587732957 00:08:00.256 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:00.256 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:00.256 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:00.256 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.256 [2024-11-28 07:33:10.928500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.256 [2024-11-28 07:33:10.928529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.256 #2 INITED cov: 11552 ft: 11553 corp: 1/1b exec/s: 0 rss: 65Mb 00:08:00.256 [2024-11-28 07:33:10.958453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.256 [2024-11-28 07:33:10.958479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.515 NEW_FUNC[1/1]: 0x1723ef8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:08:00.515 #3 NEW cov: 11674 ft: 12090 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeBit- 00:08:00.515 [2024-11-28 07:33:11.269802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.515 [2024-11-28 07:33:11.269841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.515 [2024-11-28 07:33:11.269907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.515 [2024-11-28 07:33:11.269925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.515 [2024-11-28 07:33:11.269984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.515 [2024-11-28 07:33:11.270002] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.515 [2024-11-28 07:33:11.270062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.515 [2024-11-28 07:33:11.270080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.774 #4 NEW cov: 11680 ft: 13134 corp: 3/6b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:00.774 [2024-11-28 07:33:11.319807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.319833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.319889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.319903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.319956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.319970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.320023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.320040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.774 #5 NEW cov: 11765 ft: 13346 corp: 4/10b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBit- 00:08:00.774 [2024-11-28 07:33:11.369462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.369488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 #6 NEW cov: 11765 ft: 13465 corp: 5/11b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:00.774 [2024-11-28 07:33:11.409726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.409751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.409806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.409820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.774 #7 NEW cov: 11765 ft: 13720 corp: 6/13b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 InsertByte- 00:08:00.774 [2024-11-28 
07:33:11.449996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.450021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.450076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.450090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.450144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.450158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.774 #8 NEW cov: 11765 ft: 13983 corp: 7/16b lim: 5 exec/s: 0 rss: 67Mb L: 3/4 MS: 1 InsertByte- 00:08:00.774 [2024-11-28 07:33:11.489996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.490022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.490078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.490092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.774 #9 NEW cov: 11765 ft: 14058 corp: 8/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 EraseBytes- 00:08:00.774 [2024-11-28 07:33:11.530058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.530084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.774 [2024-11-28 07:33:11.530139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.774 [2024-11-28 07:33:11.530156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 #10 NEW cov: 11765 ft: 14113 corp: 9/20b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeByte- 00:08:01.033 [2024-11-28 07:33:11.570016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.570041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 #11 NEW cov: 11765 ft: 14232 corp: 10/21b lim: 5 exec/s: 0 rss: 67Mb L: 1/4 MS: 1 ChangeByte- 00:08:01.033 [2024-11-28 07:33:11.610338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) 
qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.610363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.610416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.610430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 #12 NEW cov: 11765 ft: 14247 corp: 11/23b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeByte- 00:08:01.033 [2024-11-28 07:33:11.650758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.650783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.650839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.650852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.650907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.650920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.650974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.650988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.033 #13 NEW cov: 11765 ft: 14333 corp: 12/27b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CrossOver- 00:08:01.033 [2024-11-28 07:33:11.690820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.690845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.690901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.690916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.690967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.690981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.691033] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.691049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.033 #14 NEW cov: 11765 ft: 14353 corp: 13/31b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:01.033 [2024-11-28 07:33:11.731098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.731123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.731176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.731190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.731241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.731255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.731308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.731322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.731372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.731386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.033 #15 NEW cov: 11765 ft: 14474 corp: 14/36b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:08:01.033 [2024-11-28 07:33:11.770740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.770765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.033 [2024-11-28 07:33:11.770816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.033 [2024-11-28 07:33:11.770831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.033 #16 NEW cov: 11765 ft: 14501 corp: 15/38b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:01.292 [2024-11-28 07:33:11.810856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.810881] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.810935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.810949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.292 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.292 #17 NEW cov: 11788 ft: 14580 corp: 16/40b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:08:01.292 [2024-11-28 07:33:11.861518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.861546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.861604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.861617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.861672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.861686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.861738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.861752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.861806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.861820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.292 #18 NEW cov: 11788 ft: 14587 corp: 17/45b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:01.292 [2024-11-28 07:33:11.901269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.901293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.901347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.292 [2024-11-28 07:33:11.901361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.292 [2024-11-28 07:33:11.901416] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:11.901430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.293 #19 NEW cov: 11788 ft: 14598 corp: 18/48b lim: 5 exec/s: 19 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:08:01.293 [2024-11-28 07:33:11.941098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:11.941122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.293 #20 NEW cov: 11788 ft: 14615 corp: 19/49b lim: 5 exec/s: 20 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:01.293 [2024-11-28 07:33:11.971557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:11.971582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.293 [2024-11-28 07:33:11.971640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:11.971655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.293 [2024-11-28 07:33:11.971710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:11.971727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.293 #21 NEW cov: 11788 ft: 14634 corp: 20/52b lim: 5 exec/s: 21 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:08:01.293 [2024-11-28 07:33:12.011307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:12.011331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.293 #22 NEW cov: 11788 ft: 14663 corp: 21/53b lim: 5 exec/s: 22 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:08:01.293 [2024-11-28 07:33:12.051593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:12.051621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.293 [2024-11-28 07:33:12.051678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.293 [2024-11-28 07:33:12.051691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.551 #23 NEW cov: 11788 ft: 14669 corp: 22/55b lim: 5 exec/s: 23 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:01.551 [2024-11-28 07:33:12.091516] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.551 [2024-11-28 07:33:12.091541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.551 #24 NEW cov: 11788 ft: 14677 corp: 23/56b lim: 5 exec/s: 24 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:08:01.551 [2024-11-28 07:33:12.121909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.551 [2024-11-28 07:33:12.121934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.551 [2024-11-28 07:33:12.121991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.551 [2024-11-28 07:33:12.122004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.122058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.122072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.552 #25 NEW cov: 11788 ft: 14689 corp: 24/59b lim: 5 exec/s: 25 rss: 68Mb L: 3/5 MS: 1 ChangeBit- 00:08:01.552 [2024-11-28 07:33:12.162362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.162387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.162440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.162454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.162508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.162524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.162575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.162589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.162645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.162659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 
sqhd:0013 p:0 m:0 dnr:0 00:08:01.552 #26 NEW cov: 11788 ft: 14719 corp: 25/64b lim: 5 exec/s: 26 rss: 68Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:01.552 [2024-11-28 07:33:12.212055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.212079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.212137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.212151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.552 #27 NEW cov: 11788 ft: 14738 corp: 26/66b lim: 5 exec/s: 27 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:08:01.552 [2024-11-28 07:33:12.252440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.252466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.252523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.252538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.252595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.252613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.252671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.252684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.552 #28 NEW cov: 11788 ft: 14744 corp: 27/70b lim: 5 exec/s: 28 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:08:01.552 [2024-11-28 07:33:12.292554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.292579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.292638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.292653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.292710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.292727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.552 [2024-11-28 07:33:12.292780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.552 [2024-11-28 07:33:12.292794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.552 #29 NEW cov: 11788 ft: 14753 corp: 28/74b lim: 5 exec/s: 29 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:08:01.811 [2024-11-28 07:33:12.332829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.332854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.332912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.332926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.332982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.332996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.333052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.333066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.333121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.333135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.811 #30 NEW cov: 11788 ft: 14761 corp: 29/79b lim: 5 exec/s: 30 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:08:01.811 [2024-11-28 07:33:12.372643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.372670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.372728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.372742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.372796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.372810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.811 #31 NEW cov: 11788 ft: 14773 corp: 30/82b lim: 5 exec/s: 31 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:08:01.811 [2024-11-28 07:33:12.412469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.412494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 #32 NEW cov: 11788 ft: 14800 corp: 31/83b lim: 5 exec/s: 32 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:08:01.811 [2024-11-28 07:33:12.443021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.443047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.443103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.443118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.443170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.443184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.443235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.443249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.811 #33 NEW cov: 11788 ft: 14812 corp: 32/87b lim: 5 exec/s: 33 rss: 68Mb L: 4/5 MS: 1 EraseBytes- 00:08:01.811 [2024-11-28 07:33:12.482962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.482987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.483047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.483061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.483115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.483129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.811 #34 NEW cov: 11788 ft: 14817 corp: 33/90b lim: 5 exec/s: 34 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:08:01.811 [2024-11-28 07:33:12.523246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.523272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.523327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.523342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.523394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.523408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.523462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.523475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.811 #35 NEW cov: 11788 ft: 14823 corp: 34/94b lim: 5 exec/s: 35 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:01.811 [2024-11-28 07:33:12.563186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.563212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.563269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.563283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.811 [2024-11-28 07:33:12.563336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.811 [2024-11-28 07:33:12.563350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.070 #36 NEW cov: 11788 ft: 14836 corp: 35/97b lim: 5 exec/s: 36 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:08:02.070 [2024-11-28 07:33:12.602995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.603020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.070 #37 NEW cov: 11788 ft: 14847 corp: 36/98b lim: 5 exec/s: 37 rss: 69Mb L: 1/5 MS: 1 ChangeBinInt- 00:08:02.070 [2024-11-28 
07:33:12.643425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.643450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.643504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.643519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.643573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.643587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.070 #38 NEW cov: 11788 ft: 14849 corp: 37/101b lim: 5 exec/s: 38 rss: 69Mb L: 3/5 MS: 1 ChangeByte- 00:08:02.070 [2024-11-28 07:33:12.683379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.683403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.683459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.683473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.070 #39 NEW cov: 11788 ft: 14873 corp: 38/103b lim: 5 exec/s: 39 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:08:02.070 [2024-11-28 07:33:12.723527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.723552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.723610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.723625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.070 #40 NEW cov: 11788 ft: 14879 corp: 39/105b lim: 5 exec/s: 40 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:08:02.070 [2024-11-28 07:33:12.764094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.764118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.764172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 
[2024-11-28 07:33:12.764187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.764239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.764253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.764305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.070 [2024-11-28 07:33:12.764319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.070 [2024-11-28 07:33:12.764371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.071 [2024-11-28 07:33:12.764384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.071 #41 NEW cov: 11788 ft: 14905 corp: 40/110b lim: 5 exec/s: 41 rss: 69Mb L: 5/5 MS: 1 InsertByte- 00:08:02.071 [2024-11-28 07:33:12.803879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.071 [2024-11-28 07:33:12.803904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.071 [2024-11-28 07:33:12.803959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.071 [2024-11-28 07:33:12.803972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.071 [2024-11-28 07:33:12.804027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.071 [2024-11-28 07:33:12.804041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.071 #42 NEW cov: 11788 ft: 14961 corp: 41/113b lim: 5 exec/s: 42 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:08:02.330 [2024-11-28 07:33:12.843866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.330 [2024-11-28 07:33:12.843891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.330 [2024-11-28 07:33:12.843945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.330 [2024-11-28 07:33:12.843958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.330 #43 NEW cov: 11788 ft: 14972 corp: 42/115b lim: 5 exec/s: 43 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:02.330 [2024-11-28 07:33:12.883952] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.883978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:02.330 [2024-11-28 07:33:12.884031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.884045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:02.330 #44 NEW cov: 11788 ft: 14979 corp: 43/117b lim: 5 exec/s: 44 rss: 69Mb L: 2/5 MS: 1 ChangeByte-
00:08:02.330 [2024-11-28 07:33:12.924522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.924546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:02.330 [2024-11-28 07:33:12.924603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.924617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:02.330 [2024-11-28 07:33:12.924689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.924703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:02.330 [2024-11-28 07:33:12.924760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.924775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:02.330 [2024-11-28 07:33:12.924831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.330 [2024-11-28 07:33:12.924845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:02.330 #45 NEW cov: 11788 ft: 14982 corp: 44/122b lim: 5 exec/s: 22 rss: 69Mb L: 5/5 MS: 1 CrossOver-
00:08:02.330 #45 DONE cov: 11788 ft: 14982 corp: 44/122b lim: 5 exec/s: 22 rss: 69Mb
00:08:02.330 Done 45 runs in 2 second(s)
00:08:02.330 07:33:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
00:08:02.330 07:33:13 -- ../common.sh@72 -- # (( i++ ))
00:08:02.330 07:33:13 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:02.330 07:33:13 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:08:02.330 07:33:13 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:08:02.330 07:33:13 -- nvmf/run.sh@24 -- # local timen=1
00:08:02.330 07:33:13 -- nvmf/run.sh@25 -- # local core=0x1
00:08:02.330 07:33:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
07:33:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
07:33:13 -- nvmf/run.sh@29 -- # printf %02d 10
00:08:02.330 07:33:13 -- nvmf/run.sh@29 -- # port=4410
00:08:02.330 07:33:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:08:02.330 07:33:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:08:02.330 07:33:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:02.330 07:33:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
00:08:02.589 [2024-11-28 07:33:13.104495] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:02.589 [2024-11-28 07:33:13.104567] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661273 ]
00:08:02.589 EAL: No free 2048 kB hugepages reported on node 1
00:08:02.589 [2024-11-28 07:33:13.277950] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:02.589 [2024-11-28 07:33:13.297664] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:02.589 [2024-11-28 07:33:13.297778] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:02.589 [2024-11-28 07:33:13.348989] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:02.589 [2024-11-28 07:33:13.365376] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:08:02.847 INFO: Running with entropic power schedule (0xFF, 100).
00:08:02.847 INFO: Seed: 3068775498
00:08:02.847 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:02.847 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:02.847 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:08:02.847 INFO: A corpus is not provided, starting from an empty corpus
00:08:02.847 #2 INITED exec/s: 0 rss: 60Mb
00:08:02.847 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:02.847 This may also happen if the target rejected all inputs we tried so far 00:08:02.847 [2024-11-28 07:33:13.414248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.847 [2024-11-28 07:33:13.414278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.847 [2024-11-28 07:33:13.414339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.847 [2024-11-28 07:33:13.414355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.106 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:03.106 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.106 #8 NEW cov: 11584 ft: 11585 corp: 2/18b lim: 40 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:08:03.106 [2024-11-28 07:33:13.714962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.714994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.106 [2024-11-28 07:33:13.715056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.715070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.106 #14 NEW cov: 11697 ft: 11950 corp: 3/35b lim: 40 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeBinInt- 00:08:03.106 [2024-11-28 07:33:13.754994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.755021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.106 [2024-11-28 07:33:13.755084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2d2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.755099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.106 #15 NEW cov: 11703 ft: 12234 corp: 4/52b lim: 40 exec/s: 0 rss: 67Mb L: 17/17 MS: 1 ChangeBit- 00:08:03.106 [2024-11-28 07:33:13.795077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:910af2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.795103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.106 [2024-11-28 07:33:13.795178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 
cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.795192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.106 #16 NEW cov: 11788 ft: 12672 corp: 5/70b lim: 40 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 InsertByte- 00:08:03.106 [2024-11-28 07:33:13.835233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:910af2f2 cdw11:f2f2f272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.835260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.106 [2024-11-28 07:33:13.835322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.835336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.106 #17 NEW cov: 11788 ft: 12782 corp: 6/88b lim: 40 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBit- 00:08:03.106 [2024-11-28 07:33:13.875384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:f2f2f2f2 cdw11:910af2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.875410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.106 [2024-11-28 07:33:13.875471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2170d0a cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.106 [2024-11-28 07:33:13.875485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 #19 NEW cov: 11788 ft: 12862 corp: 7/105b lim: 40 exec/s: 0 rss: 67Mb L: 17/18 MS: 2 CrossOver-CrossOver- 00:08:03.365 [2024-11-28 07:33:13.915411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f8f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.915437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.915497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.915511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 #20 NEW cov: 11788 ft: 12892 corp: 8/122b lim: 40 exec/s: 0 rss: 67Mb L: 17/18 MS: 1 ChangeBinInt- 00:08:03.365 [2024-11-28 07:33:13.955725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f8f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.955751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.955818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.955835] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.955893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.955908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.365 #21 NEW cov: 11788 ft: 13176 corp: 9/152b lim: 40 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CrossOver- 00:08:03.365 [2024-11-28 07:33:13.995990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.996017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.996078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.996093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.996152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.996166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:13.996225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:13.996240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.365 #22 NEW cov: 11788 ft: 13707 corp: 10/189b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:03.365 [2024-11-28 07:33:14.035812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.035838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:14.035906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.035921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 #23 NEW cov: 11788 ft: 13784 corp: 11/206b lim: 40 exec/s: 0 rss: 68Mb L: 17/37 MS: 1 ShuffleBytes- 00:08:03.365 [2024-11-28 07:33:14.076193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.076219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:14.076282] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f225 cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.076297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:14.076356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.076370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.365 [2024-11-28 07:33:14.076434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.076448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.365 #24 NEW cov: 11788 ft: 13908 corp: 12/243b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeBinInt- 00:08:03.365 [2024-11-28 07:33:14.115946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:000af2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.365 [2024-11-28 07:33:14.115972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 #28 NEW cov: 11788 ft: 14291 corp: 13/254b lim: 40 exec/s: 0 rss: 68Mb L: 11/37 MS: 4 CrossOver-CopyPart-CrossOver-CrossOver- 00:08:03.624 [2024-11-28 07:33:14.156008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.156034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 #29 NEW cov: 11788 ft: 14304 corp: 14/264b lim: 40 exec/s: 0 rss: 68Mb L: 10/37 MS: 1 EraseBytes- 00:08:03.624 [2024-11-28 07:33:14.196285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:910af2f2 cdw11:f2f2f272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.196310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 [2024-11-28 07:33:14.196369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f206 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.196384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.624 #30 NEW cov: 11788 ft: 14313 corp: 15/282b lim: 40 exec/s: 0 rss: 68Mb L: 18/37 MS: 1 ChangeByte- 00:08:03.624 [2024-11-28 07:33:14.236438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.236463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 [2024-11-28 07:33:14.236523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.236537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.624 #31 NEW cov: 11788 ft: 14331 corp: 16/299b lim: 40 exec/s: 0 rss: 68Mb L: 17/37 MS: 1 ShuffleBytes- 00:08:03.624 [2024-11-28 07:33:14.276514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.276539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 [2024-11-28 07:33:14.276602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f22c cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.276617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.624 #32 NEW cov: 11788 ft: 14384 corp: 17/317b lim: 40 exec/s: 0 rss: 68Mb L: 18/37 MS: 1 InsertByte- 00:08:03.624 [2024-11-28 07:33:14.316615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.316640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 [2024-11-28 07:33:14.316706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.316720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.624 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.624 #33 NEW cov: 11811 ft: 14427 corp: 18/334b lim: 40 exec/s: 0 rss: 68Mb L: 17/37 MS: 1 CopyPart- 00:08:03.624 [2024-11-28 07:33:14.356763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:f20af2f2 cdw11:9172f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.356789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.624 [2024-11-28 07:33:14.356851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.624 [2024-11-28 07:33:14.356866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.624 #34 NEW cov: 11811 ft: 14441 corp: 19/352b lim: 40 exec/s: 0 rss: 68Mb L: 18/37 MS: 1 ShuffleBytes- 00:08:03.884 [2024-11-28 07:33:14.396894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2ec cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.396920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.396981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.396996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 #35 NEW cov: 11811 ft: 14499 corp: 20/369b lim: 40 exec/s: 35 rss: 68Mb L: 17/37 MS: 1 ChangeBinInt- 00:08:03.884 [2024-11-28 07:33:14.437134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.437159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.437223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.437238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.437300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffff2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.437314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.884 #36 NEW cov: 11811 ft: 14513 corp: 21/398b lim: 40 exec/s: 36 rss: 68Mb L: 29/37 MS: 1 InsertRepeatedBytes- 00:08:03.884 [2024-11-28 07:33:14.477098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.477124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.477185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f272f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.477199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 #37 NEW cov: 11811 ft: 14515 corp: 22/415b lim: 40 exec/s: 37 rss: 68Mb L: 17/37 MS: 1 ChangeBit- 00:08:03.884 [2024-11-28 07:33:14.517333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.517360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.517421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.517438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.517500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.517515] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.884 #38 NEW cov: 11811 ft: 14524 corp: 23/441b lim: 40 exec/s: 38 rss: 68Mb L: 26/37 MS: 1 CopyPart- 00:08:03.884 [2024-11-28 07:33:14.557173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:28606060 cdw11:60606060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.557198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 #42 NEW cov: 11811 ft: 14552 corp: 24/454b lim: 40 exec/s: 42 rss: 68Mb L: 13/37 MS: 4 InsertByte-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:03.884 [2024-11-28 07:33:14.597760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.597786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.597845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.597860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.597920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.597935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.597994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2170d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.598009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.637781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.637807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.637867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.637882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.637943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.637960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.884 [2024-11-28 07:33:14.638020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f20d 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.884 [2024-11-28 07:33:14.638034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.143 #44 NEW cov: 11811 ft: 14558 corp: 25/486b lim: 40 exec/s: 44 rss: 68Mb L: 32/37 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:04.143 [2024-11-28 07:33:14.677689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f6f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.677716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.677777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f272f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.677791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 #45 NEW cov: 11811 ft: 14612 corp: 26/503b lim: 40 exec/s: 45 rss: 68Mb L: 17/37 MS: 1 ChangeBit- 00:08:04.143 [2024-11-28 07:33:14.718084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.718110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.718171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.718185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.718243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.718257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.718313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.718327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.143 #46 NEW cov: 11811 ft: 14727 corp: 27/541b lim: 40 exec/s: 46 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:08:04.143 [2024-11-28 07:33:14.758063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.758089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.758152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f24242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.758167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.758230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242d2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.758244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.143 #47 NEW cov: 11811 ft: 14738 corp: 28/566b lim: 40 exec/s: 47 rss: 69Mb L: 25/38 MS: 1 InsertRepeatedBytes- 00:08:04.143 [2024-11-28 07:33:14.798027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e0d0d cdw11:0d0d0d0d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.798055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.798119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:03f2f2f2 cdw11:f2f2f217 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.798133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 #48 NEW cov: 11811 ft: 14810 corp: 29/583b lim: 40 exec/s: 48 rss: 69Mb L: 17/38 MS: 1 ChangeBinInt- 00:08:04.143 [2024-11-28 07:33:14.828295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f8f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.828319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.828381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.828395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.828456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.828471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.868281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f8f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.868307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.868368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.868382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 #50 NEW cov: 11811 ft: 14818 corp: 30/603b lim: 40 exec/s: 50 rss: 69Mb L: 20/38 MS: 2 CopyPart-EraseBytes- 00:08:04.143 [2024-11-28 07:33:14.908631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af8f2f2 cdw11:f2f2f2f2 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.908656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.908720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.908735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.143 [2024-11-28 07:33:14.908797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.143 [2024-11-28 07:33:14.908811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.144 [2024-11-28 07:33:14.908873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.144 [2024-11-28 07:33:14.908888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.414 #51 NEW cov: 11811 ft: 14834 corp: 31/641b lim: 40 exec/s: 51 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:04.414 [2024-11-28 07:33:14.948629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f228 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.948654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 [2024-11-28 07:33:14.948717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.948732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.414 [2024-11-28 07:33:14.948793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.948807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.414 #52 NEW cov: 11811 ft: 14853 corp: 32/668b lim: 40 exec/s: 52 rss: 69Mb L: 27/38 MS: 1 InsertByte- 00:08:04.414 [2024-11-28 07:33:14.988762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.988788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 [2024-11-28 07:33:14.988854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.988868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.414 [2024-11-28 07:33:14.988932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE 
(82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffff2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:14.988946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.414 #53 NEW cov: 11811 ft: 14863 corp: 33/697b lim: 40 exec/s: 53 rss: 69Mb L: 29/38 MS: 1 ShuffleBytes- 00:08:04.414 [2024-11-28 07:33:15.028631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:15.028656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 #54 NEW cov: 11811 ft: 14881 corp: 34/707b lim: 40 exec/s: 54 rss: 69Mb L: 10/38 MS: 1 ChangeBit- 00:08:04.414 [2024-11-28 07:33:15.068700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:15.068725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 #55 NEW cov: 11811 ft: 14889 corp: 35/720b lim: 40 exec/s: 55 rss: 69Mb L: 13/38 MS: 1 EraseBytes- 00:08:04.414 [2024-11-28 07:33:15.108977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:15.109003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 [2024-11-28 07:33:15.109063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2d2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:15.109078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.414 #56 NEW cov: 11811 ft: 14906 corp: 36/737b lim: 40 exec/s: 56 rss: 69Mb L: 17/38 MS: 1 CopyPart- 00:08:04.414 [2024-11-28 07:33:15.148967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.414 [2024-11-28 07:33:15.148996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.414 #57 NEW cov: 11811 ft: 14934 corp: 37/750b lim: 40 exec/s: 57 rss: 69Mb L: 13/38 MS: 1 ChangeByte- 00:08:04.672 [2024-11-28 07:33:15.189221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:910af2f2 cdw11:f2f2f272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.672 [2024-11-28 07:33:15.189246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.672 [2024-11-28 07:33:15.189309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.189324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 #58 NEW cov: 11811 ft: 14935 corp: 38/768b lim: 40 exec/s: 58 rss: 69Mb L: 18/38 MS: 1 
ChangeBit- 00:08:04.673 [2024-11-28 07:33:15.219310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:30f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.219335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.219398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f272f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.219413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 #59 NEW cov: 11811 ft: 14991 corp: 39/785b lim: 40 exec/s: 59 rss: 69Mb L: 17/38 MS: 1 ChangeByte- 00:08:04.673 [2024-11-28 07:33:15.259578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:f2f2f2f2 cdw11:910af2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.259608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.259670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f1f1f1 cdw11:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.259684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.259725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f117 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.259740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.673 #60 NEW cov: 11811 ft: 15042 corp: 40/816b lim: 40 exec/s: 60 rss: 69Mb L: 31/38 MS: 1 InsertRepeatedBytes- 00:08:04.673 [2024-11-28 07:33:15.299734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.299759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.299825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.299839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.299881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.299898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.673 #61 NEW cov: 11811 ft: 15063 corp: 41/842b lim: 40 exec/s: 61 rss: 69Mb L: 26/38 MS: 1 ShuffleBytes- 00:08:04.673 [2024-11-28 07:33:15.339912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0af2f228 cdw11:f2f2f2f2 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.339937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.339998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f2f2ffff cdw11:fffffff2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.340012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.340072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2f2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.340085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.340145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f2f2f2f2 cdw11:f2f2170d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.340159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.673 #62 NEW cov: 11811 ft: 15084 corp: 42/874b lim: 40 exec/s: 62 rss: 69Mb L: 32/38 MS: 1 InsertRepeatedBytes- 00:08:04.673 [2024-11-28 07:33:15.379924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a21f2ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.379949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.380010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.380024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.673 [2024-11-28 07:33:15.380084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffff2f2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.673 [2024-11-28 07:33:15.380098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.673 #63 NEW cov: 11811 ft: 15089 corp: 43/903b lim: 40 exec/s: 31 rss: 69Mb L: 29/38 MS: 1 ChangeByte- 00:08:04.673 #63 DONE cov: 11811 ft: 15089 corp: 43/903b lim: 40 exec/s: 31 rss: 69Mb 00:08:04.673 Done 63 runs in 2 second(s) 00:08:04.932 07:33:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:04.932 07:33:15 -- ../common.sh@72 -- # (( i++ )) 00:08:04.932 07:33:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.932 07:33:15 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:04.932 07:33:15 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:04.932 07:33:15 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.932 07:33:15 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.932 07:33:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:04.932 07:33:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:04.932 07:33:15 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:04.932 
07:33:15 -- nvmf/run.sh@29 -- # port=4411 00:08:04.932 07:33:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:04.932 07:33:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:04.932 07:33:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.932 07:33:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:04.932 [2024-11-28 07:33:15.551654] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:04.932 [2024-11-28 07:33:15.551725] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661316 ] 00:08:04.932 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.191 [2024-11-28 07:33:15.733081] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.191 [2024-11-28 07:33:15.752919] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.191 [2024-11-28 07:33:15.753052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.191 [2024-11-28 07:33:15.804296] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.191 [2024-11-28 07:33:15.820680] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:05.191 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.191 INFO: Seed: 1228768601 00:08:05.191 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:05.191 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:05.191 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:05.191 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.191 #2 INITED exec/s: 0 rss: 59Mb 00:08:05.191 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.191 This may also happen if the target rejected all inputs we tried so far 00:08:05.191 [2024-11-28 07:33:15.868432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.191 [2024-11-28 07:33:15.868465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.449 NEW_FUNC[1/671]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:05.449 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.449 #4 NEW cov: 11592 ft: 11591 corp: 2/15b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:05.449 [2024-11-28 07:33:16.189203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.449 [2024-11-28 07:33:16.189240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.708 #5 NEW cov: 11709 ft: 12061 corp: 3/29b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 ChangeBit- 00:08:05.708 [2024-11-28 07:33:16.259288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.259317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.708 #6 NEW cov: 11715 ft: 12361 corp: 4/44b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 InsertByte- 00:08:05.708 [2024-11-28 07:33:16.319440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.319471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.708 #7 NEW cov: 11800 ft: 12734 corp: 5/59b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 CopyPart- 00:08:05.708 [2024-11-28 07:33:16.379729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.379762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.708 [2024-11-28 07:33:16.379811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.379834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.708 [2024-11-28 07:33:16.379863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:00040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.379878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.708 #8 NEW cov: 11800 ft: 13539 corp: 6/86b lim: 40 exec/s: 0 rss: 67Mb L: 
27/27 MS: 1 InsertRepeatedBytes- 00:08:05.708 [2024-11-28 07:33:16.439863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.439894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.708 [2024-11-28 07:33:16.439928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82868282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.439944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.708 [2024-11-28 07:33:16.439976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:82000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.708 [2024-11-28 07:33:16.439991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.967 #9 NEW cov: 11800 ft: 13724 corp: 7/114b lim: 40 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertByte- 00:08:05.967 [2024-11-28 07:33:16.510050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.510080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.967 [2024-11-28 07:33:16.510128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82868282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.510144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.967 [2024-11-28 07:33:16.510173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:82000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.510188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.967 #10 NEW cov: 11800 ft: 13775 corp: 8/142b lim: 40 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ChangeBit- 00:08:05.967 [2024-11-28 07:33:16.580146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a002000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.580176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.967 #11 NEW cov: 11800 ft: 13823 corp: 9/156b lim: 40 exec/s: 0 rss: 67Mb L: 14/28 MS: 1 ChangeBit- 00:08:05.967 [2024-11-28 07:33:16.630379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.630410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.967 [2024-11-28 07:33:16.630448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 
[2024-11-28 07:33:16.630464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.967 [2024-11-28 07:33:16.630494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:5d828282 cdw11:82000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.630510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.967 #12 NEW cov: 11800 ft: 13923 corp: 10/184b lim: 40 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertByte- 00:08:05.967 [2024-11-28 07:33:16.680357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.680388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.967 #13 NEW cov: 11800 ft: 13978 corp: 11/199b lim: 40 exec/s: 0 rss: 67Mb L: 15/28 MS: 1 CopyPart- 00:08:05.967 [2024-11-28 07:33:16.730537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a00de00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.967 [2024-11-28 07:33:16.730567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.226 #14 NEW cov: 11817 ft: 14059 corp: 12/214b lim: 40 exec/s: 0 rss: 68Mb L: 15/28 MS: 1 InsertByte- 00:08:06.226 [2024-11-28 07:33:16.781614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.781648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 [2024-11-28 07:33:16.781715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82868282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.781734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.226 [2024-11-28 07:33:16.781799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.781817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.226 #15 NEW cov: 11817 ft: 14122 corp: 13/242b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 CopyPart- 00:08:06.226 [2024-11-28 07:33:16.821356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.821382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 #16 NEW cov: 11817 ft: 14266 corp: 14/256b lim: 40 exec/s: 0 rss: 68Mb L: 14/28 MS: 1 ChangeByte- 00:08:06.226 [2024-11-28 07:33:16.861752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:4 nsid:0 cdw10:0a000000 cdw11:00960000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.861777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 [2024-11-28 07:33:16.861835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.861849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.226 [2024-11-28 07:33:16.861890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:825d8282 cdw11:82820004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.861904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.226 #17 NEW cov: 11817 ft: 14299 corp: 15/285b lim: 40 exec/s: 17 rss: 68Mb L: 29/29 MS: 1 InsertByte- 00:08:06.226 [2024-11-28 07:33:16.901582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.901612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 #18 NEW cov: 11817 ft: 14338 corp: 16/300b lim: 40 exec/s: 18 rss: 68Mb L: 15/29 MS: 1 CrossOver- 00:08:06.226 [2024-11-28 07:33:16.941813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.941839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.226 [2024-11-28 07:33:16.941894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.941907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.226 #19 NEW cov: 11817 ft: 14549 corp: 17/321b lim: 40 exec/s: 19 rss: 68Mb L: 21/29 MS: 1 CopyPart- 00:08:06.226 [2024-11-28 07:33:16.981782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00040a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.226 [2024-11-28 07:33:16.981806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.485 #21 NEW cov: 11817 ft: 14566 corp: 18/329b lim: 40 exec/s: 21 rss: 68Mb L: 8/29 MS: 2 CrossOver-CrossOver- 00:08:06.485 [2024-11-28 07:33:17.021910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.485 [2024-11-28 07:33:17.021936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.485 #22 NEW cov: 11817 ft: 14585 corp: 19/343b lim: 40 exec/s: 22 rss: 68Mb L: 14/29 MS: 1 ChangeBinInt- 00:08:06.485 [2024-11-28 07:33:17.052259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:0a000000 cdw11:00960000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.485 [2024-11-28 07:33:17.052284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.485 [2024-11-28 07:33:17.052342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.485 [2024-11-28 07:33:17.052356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.485 [2024-11-28 07:33:17.052412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:825d8282 cdw11:82820004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.485 [2024-11-28 07:33:17.052425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.486 #23 NEW cov: 11817 ft: 14607 corp: 20/373b lim: 40 exec/s: 23 rss: 68Mb L: 30/30 MS: 1 InsertByte- 00:08:06.486 [2024-11-28 07:33:17.092276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a007000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.092301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.092359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.092373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.486 #24 NEW cov: 11817 ft: 14625 corp: 21/395b lim: 40 exec/s: 24 rss: 68Mb L: 22/30 MS: 1 InsertByte- 00:08:06.486 [2024-11-28 07:33:17.132418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.132442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.132498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82868282 cdw11:82820004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.132512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.486 #25 NEW cov: 11817 ft: 14640 corp: 22/416b lim: 40 exec/s: 25 rss: 68Mb L: 21/30 MS: 1 EraseBytes- 00:08:06.486 [2024-11-28 07:33:17.172760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ae4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.172784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.172840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.172854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.486 
[2024-11-28 07:33:17.172908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.172922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.172975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.172989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.486 #26 NEW cov: 11817 ft: 14942 corp: 23/453b lim: 40 exec/s: 26 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:06.486 [2024-11-28 07:33:17.212450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.212476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.486 #27 NEW cov: 11817 ft: 14959 corp: 24/467b lim: 40 exec/s: 27 rss: 68Mb L: 14/37 MS: 1 ChangeBit- 00:08:06.486 [2024-11-28 07:33:17.253062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0ae4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.253087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.253142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.253156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.253208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.253225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.486 [2024-11-28 07:33:17.253280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e4e4e4e4 cdw11:e4e4e4e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.486 [2024-11-28 07:33:17.253294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.745 #33 NEW cov: 11817 ft: 14981 corp: 25/504b lim: 40 exec/s: 33 rss: 68Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:06.745 [2024-11-28 07:33:17.302733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a002000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.302757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.745 #34 NEW cov: 11817 ft: 15012 corp: 26/518b lim: 40 exec/s: 34 rss: 68Mb L: 14/37 MS: 1 ShuffleBytes- 00:08:06.745 [2024-11-28 07:33:17.342978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.343003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.745 [2024-11-28 07:33:17.343057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.343071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.745 #35 NEW cov: 11817 ft: 15021 corp: 27/537b lim: 40 exec/s: 35 rss: 68Mb L: 19/37 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:06.745 [2024-11-28 07:33:17.382922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.382947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.745 #36 NEW cov: 11817 ft: 15031 corp: 28/551b lim: 40 exec/s: 36 rss: 68Mb L: 14/37 MS: 1 ShuffleBytes- 00:08:06.745 [2024-11-28 07:33:17.423205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00008282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.423230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.745 [2024-11-28 07:33:17.423285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:86828282 cdw11:82820004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.423299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.745 #37 NEW cov: 11817 ft: 15043 corp: 29/572b lim: 40 exec/s: 37 rss: 69Mb L: 21/37 MS: 1 CopyPart- 00:08:06.745 [2024-11-28 07:33:17.463169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:09fc0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.463194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.745 #38 NEW cov: 11817 ft: 15048 corp: 30/586b lim: 40 exec/s: 38 rss: 69Mb L: 14/37 MS: 1 ChangeBinInt- 00:08:06.745 [2024-11-28 07:33:17.503266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:093cfc00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.745 [2024-11-28 07:33:17.503291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 #39 NEW cov: 11817 ft: 15058 corp: 31/601b lim: 40 exec/s: 39 rss: 69Mb L: 15/37 MS: 1 InsertByte- 00:08:07.004 [2024-11-28 07:33:17.543837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00960000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.543865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.543920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.543934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.543988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:825d8282 cdw11:2a2a2a2a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.544001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.544055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:82820004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.544069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.004 #40 NEW cov: 11817 ft: 15067 corp: 32/634b lim: 40 exec/s: 40 rss: 69Mb L: 33/37 MS: 1 InsertRepeatedBytes- 00:08:07.004 [2024-11-28 07:33:17.583617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a007000 cdw11:5a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.583644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.583700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.583713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.004 #41 NEW cov: 11817 ft: 15090 corp: 33/657b lim: 40 exec/s: 41 rss: 69Mb L: 23/37 MS: 1 InsertByte- 00:08:07.004 [2024-11-28 07:33:17.623630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:097efc00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.623656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 #42 NEW cov: 11817 ft: 15101 corp: 34/672b lim: 40 exec/s: 42 rss: 69Mb L: 15/37 MS: 1 InsertByte- 00:08:07.004 [2024-11-28 07:33:17.663669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a00001a cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.663694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 #43 NEW cov: 11817 ft: 15121 corp: 35/687b lim: 40 exec/s: 43 rss: 69Mb L: 15/37 MS: 1 ChangeByte- 00:08:07.004 [2024-11-28 07:33:17.704100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.704126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.704184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82868282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.704198] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.004 [2024-11-28 07:33:17.704253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:8200fcff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.704266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.004 #44 NEW cov: 11817 ft: 15141 corp: 36/715b lim: 40 exec/s: 44 rss: 69Mb L: 28/37 MS: 1 ChangeBinInt- 00:08:07.004 [2024-11-28 07:33:17.743903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.004 [2024-11-28 07:33:17.743927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.004 #45 NEW cov: 11824 ft: 15182 corp: 37/729b lim: 40 exec/s: 45 rss: 69Mb L: 14/37 MS: 1 ChangeBit- 00:08:07.263 [2024-11-28 07:33:17.774051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:d2000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.774077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.263 #46 NEW cov: 11824 ft: 15190 corp: 38/744b lim: 40 exec/s: 46 rss: 69Mb L: 15/37 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:07.263 [2024-11-28 07:33:17.814292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00008282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.814316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.263 [2024-11-28 07:33:17.814373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82820004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.814388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.263 #47 NEW cov: 11824 ft: 15193 corp: 39/761b lim: 40 exec/s: 47 rss: 69Mb L: 17/37 MS: 1 EraseBytes- 00:08:07.263 [2024-11-28 07:33:17.854988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.855013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.263 [2024-11-28 07:33:17.855070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00960000 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.855084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.263 [2024-11-28 07:33:17.855141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:825d8282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.263 [2024-11-28 07:33:17.855155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:07.263 [2024-11-28 07:33:17.855212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:82820004 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:07.263 [2024-11-28 07:33:17.855226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:07.263 #48 NEW cov: 11824 ft: 15203 corp: 40/798b lim: 40 exec/s: 24 rss: 69Mb L: 37/37 MS: 1 PersAutoDict- DE: "\000\000\000\000"-
00:08:07.263 #48 DONE cov: 11824 ft: 15203 corp: 40/798b lim: 40 exec/s: 24 rss: 69Mb
00:08:07.263 ###### Recommended dictionary. ######
00:08:07.263 "\000\000\000\000" # Uses: 2
00:08:07.263 ###### End of recommended dictionary. ######
00:08:07.263 Done 48 runs in 2 second(s)
00:08:07.263 07:33:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf
00:08:07.263 07:33:17 -- ../common.sh@72 -- # (( i++ ))
00:08:07.263 07:33:17 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:07.263 07:33:17 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1
00:08:07.263 07:33:17 -- nvmf/run.sh@23 -- # local fuzzer_type=12
00:08:07.263 07:33:17 -- nvmf/run.sh@24 -- # local timen=1
00:08:07.263 07:33:17 -- nvmf/run.sh@25 -- # local core=0x1
00:08:07.263 07:33:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:07.263 07:33:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf
00:08:07.263 07:33:17 -- nvmf/run.sh@29 -- # printf %02d 12
00:08:07.263 07:33:17 -- nvmf/run.sh@29 -- # port=4412
00:08:07.263 07:33:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:07.263 07:33:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412'
00:08:07.263 07:33:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:07.263 07:33:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock
00:08:07.522 [2024-11-28 07:33:18.038411] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:07.522 [2024-11-28 07:33:18.038481] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661369 ]
00:08:07.522 EAL: No free 2048 kB hugepages reported on node 1
00:08:07.522 [2024-11-28 07:33:18.214495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:07.522 [2024-11-28 07:33:18.233847] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:07.522 [2024-11-28 07:33:18.233960] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:07.522 [2024-11-28 07:33:18.285166] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:07.522 [2024-11-28 07:33:18.301540] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 ***
00:08:07.781 INFO: Running with entropic power schedule (0xFF, 100).
00:08:07.781 INFO: Seed: 3711768936
00:08:07.781 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:07.781 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:07.781 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:07.781 INFO: A corpus is not provided, starting from an empty corpus
00:08:07.781 #2 INITED exec/s: 0 rss: 59Mb
00:08:07.781 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:07.781 This may also happen if the target rejected all inputs we tried so far
00:08:07.781 [2024-11-28 07:33:18.347174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:07.781 [2024-11-28 07:33:18.347202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:07.781 [2024-11-28 07:33:18.347259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:07.781 [2024-11-28 07:33:18.347273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:07.781 [2024-11-28 07:33:18.347327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:07.781 [2024-11-28 07:33:18.347340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:07.781 [2024-11-28 07:33:18.347394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:07.782 [2024-11-28 07:33:18.347408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:08.041 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241
00:08:08.041 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:08.041 #5 NEW cov: 11594 ft: 11595 corp: 2/35b lim: 40 exec/s: 0 rss: 67Mb L: 34/34
MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:08.041 [2024-11-28 07:33:18.658198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.658230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.658290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.658305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.658360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.658374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.658430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.658445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.658503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:66666666 cdw11:6666660e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.658518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.041 #6 NEW cov: 11707 ft: 12067 corp: 3/75b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:08.041 [2024-11-28 07:33:18.708159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.708186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.708246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.708260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.708318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.708333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.708390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.708404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.041 #7 NEW cov: 11713 ft: 12440 
corp: 4/109b lim: 40 exec/s: 0 rss: 67Mb L: 34/40 MS: 1 CopyPart- 00:08:08.041 [2024-11-28 07:33:18.748438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.748464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.748522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.748540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.748596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.748614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.748671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.748686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.748741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:66666666 cdw11:6666660e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.748755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.041 #8 NEW cov: 11798 ft: 12644 corp: 5/149b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CopyPart- 00:08:08.041 [2024-11-28 07:33:18.788354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.041 [2024-11-28 07:33:18.788380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.041 [2024-11-28 07:33:18.788439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.042 [2024-11-28 07:33:18.788454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.042 [2024-11-28 07:33:18.788511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.042 [2024-11-28 07:33:18.788525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.042 [2024-11-28 07:33:18.788582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.042 [2024-11-28 07:33:18.788595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.042 #9 NEW cov: 11798 
ft: 12809 corp: 6/184b lim: 40 exec/s: 0 rss: 67Mb L: 35/40 MS: 1 CopyPart- 00:08:08.301 [2024-11-28 07:33:18.828449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.828475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.828534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.828549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.828606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.828620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.828678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.828692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.301 #10 NEW cov: 11798 ft: 12931 corp: 7/219b lim: 40 exec/s: 0 rss: 67Mb L: 35/40 MS: 1 CrossOver- 00:08:08.301 [2024-11-28 07:33:18.868584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.868613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.868677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.868692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.868752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.868766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.868822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.868836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.301 #11 NEW cov: 11798 ft: 12982 corp: 8/255b lim: 40 exec/s: 0 rss: 67Mb L: 36/40 MS: 1 InsertByte- 00:08:08.301 [2024-11-28 07:33:18.908741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.908768] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.908826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.908841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.908897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666466 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.908912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.301 [2024-11-28 07:33:18.908967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.301 [2024-11-28 07:33:18.908982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.301 #12 NEW cov: 11798 ft: 13004 corp: 9/290b lim: 40 exec/s: 0 rss: 67Mb L: 35/40 MS: 1 ChangeBit- 00:08:08.302 [2024-11-28 07:33:18.948892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.948917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.948973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a666666 cdw11:6666a199 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.948989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.949044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:99996666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.949058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.949117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.949132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.302 #13 NEW cov: 11798 ft: 13027 corp: 10/325b lim: 40 exec/s: 0 rss: 67Mb L: 35/40 MS: 1 ChangeBinInt- 00:08:08.302 [2024-11-28 07:33:18.988957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.988982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.989041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 
07:33:18.989055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.989111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.989125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:18.989181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:18.989195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.302 #14 NEW cov: 11798 ft: 13045 corp: 11/359b lim: 40 exec/s: 0 rss: 67Mb L: 34/40 MS: 1 CrossOver- 00:08:08.302 [2024-11-28 07:33:19.029296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.029321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.029380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.029394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.029450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.029464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.029520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.029534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.029590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:66666666 cdw11:6666660e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.029608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.302 #15 NEW cov: 11798 ft: 13081 corp: 12/399b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:08.302 [2024-11-28 07:33:19.069294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.069320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.069392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:08.302 [2024-11-28 07:33:19.069407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.069465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.069479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.302 [2024-11-28 07:33:19.069535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.302 [2024-11-28 07:33:19.069549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.562 #21 NEW cov: 11798 ft: 13099 corp: 13/433b lim: 40 exec/s: 0 rss: 67Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:08.562 [2024-11-28 07:33:19.109293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.109319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.109377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666665 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.109392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.109449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.109463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.109522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.109536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.562 #22 NEW cov: 11798 ft: 13163 corp: 14/467b lim: 40 exec/s: 0 rss: 67Mb L: 34/40 MS: 1 ChangeBinInt- 00:08:08.562 [2024-11-28 07:33:19.149500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.149527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.149585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.149604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.149661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.149675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.149730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.149745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.562 #23 NEW cov: 11798 ft: 13257 corp: 15/500b lim: 40 exec/s: 0 rss: 67Mb L: 33/40 MS: 1 EraseBytes- 00:08:08.562 [2024-11-28 07:33:19.189558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.189585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.189652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.189667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.189727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.189742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.189803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666600 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.189818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.562 #24 NEW cov: 11798 ft: 13269 corp: 16/535b lim: 40 exec/s: 0 rss: 67Mb L: 35/40 MS: 1 ChangeBinInt- 00:08:08.562 [2024-11-28 07:33:19.229659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.229685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.229747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.229761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.229821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.229834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.229895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 
cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.229909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.562 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.562 #25 NEW cov: 11821 ft: 13323 corp: 17/570b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 InsertByte- 00:08:08.562 [2024-11-28 07:33:19.269753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a66e866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.269779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.269816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.269831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.269888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.562 [2024-11-28 07:33:19.269903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.562 [2024-11-28 07:33:19.269963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.563 [2024-11-28 07:33:19.269977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.563 #26 NEW cov: 11821 ft: 13451 corp: 18/605b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 ChangeByte- 00:08:08.563 [2024-11-28 07:33:19.309923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.563 [2024-11-28 07:33:19.309949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.563 [2024-11-28 07:33:19.310008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.563 [2024-11-28 07:33:19.310022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.563 [2024-11-28 07:33:19.310077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.563 [2024-11-28 07:33:19.310091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.563 [2024-11-28 07:33:19.310149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666600 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.563 [2024-11-28 07:33:19.310163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.822 
#27 NEW cov: 11821 ft: 13456 corp: 19/640b lim: 40 exec/s: 27 rss: 68Mb L: 35/40 MS: 1 ShuffleBytes- 00:08:08.822 [2024-11-28 07:33:19.350072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:290a6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.822 [2024-11-28 07:33:19.350097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.822 [2024-11-28 07:33:19.350156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.822 [2024-11-28 07:33:19.350170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.822 [2024-11-28 07:33:19.350205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.822 [2024-11-28 07:33:19.350219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.822 [2024-11-28 07:33:19.350279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.350293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.823 #28 NEW cov: 11821 ft: 13519 corp: 20/675b lim: 40 exec/s: 28 rss: 68Mb L: 35/40 MS: 1 InsertByte- 00:08:08.823 [2024-11-28 07:33:19.390128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.390153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.390211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.390227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.390282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.390296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.390352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.390366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.823 #29 NEW cov: 11821 ft: 13531 corp: 21/711b lim: 40 exec/s: 29 rss: 68Mb L: 36/40 MS: 1 ChangeBit- 00:08:08.823 [2024-11-28 07:33:19.429976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.430001] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.430061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.430075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 #30 NEW cov: 11821 ft: 13878 corp: 22/731b lim: 40 exec/s: 30 rss: 68Mb L: 20/40 MS: 1 EraseBytes- 00:08:08.823 [2024-11-28 07:33:19.470034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.470060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.470121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.470136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 #31 NEW cov: 11821 ft: 13937 corp: 23/752b lim: 40 exec/s: 31 rss: 68Mb L: 21/40 MS: 1 CrossOver- 00:08:08.823 [2024-11-28 07:33:19.510173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.510198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.510258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.510273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 #32 NEW cov: 11821 ft: 13969 corp: 24/773b lim: 40 exec/s: 32 rss: 68Mb L: 21/40 MS: 1 CrossOver- 00:08:08.823 [2024-11-28 07:33:19.550633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.550659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.550720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.550734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.550795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.550811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.550871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.550885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.823 #33 NEW cov: 11821 ft: 13976 corp: 25/810b lim: 40 exec/s: 33 rss: 68Mb L: 37/40 MS: 1 CopyPart- 00:08:08.823 [2024-11-28 07:33:19.590684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.590710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.590771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.590785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.590841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:666666ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.590855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.823 [2024-11-28 07:33:19.590915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffff0166 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.823 [2024-11-28 07:33:19.590929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.083 #34 NEW cov: 11821 ft: 14036 corp: 26/843b lim: 40 exec/s: 34 rss: 68Mb L: 33/40 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\001"- 00:08:09.083 [2024-11-28 07:33:19.630485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.630510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.630571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.630585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 #35 NEW cov: 11821 ft: 14047 corp: 27/863b lim: 40 exec/s: 35 rss: 69Mb L: 20/40 MS: 1 ChangeByte- 00:08:09.083 [2024-11-28 07:33:19.670992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.671018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.671077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:666666ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.671092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.671150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.671165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.671221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.671238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.083 #36 NEW cov: 11821 ft: 14059 corp: 28/901b lim: 40 exec/s: 36 rss: 69Mb L: 38/40 MS: 1 CopyPart- 00:08:09.083 [2024-11-28 07:33:19.711054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.711080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.711139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.711154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.711214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.711228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.711286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.711300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.083 #37 NEW cov: 11821 ft: 14130 corp: 29/937b lim: 40 exec/s: 37 rss: 69Mb L: 36/40 MS: 1 CopyPart- 00:08:09.083 [2024-11-28 07:33:19.751257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.751283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.751342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.751357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.751413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.751427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.751484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.751498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.083 #38 NEW cov: 11821 ft: 14144 corp: 30/972b lim: 40 exec/s: 38 rss: 69Mb L: 35/40 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\001"- 00:08:09.083 [2024-11-28 07:33:19.791300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.791326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.791384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.791399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.791463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.791477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.791537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.791551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.083 #39 NEW cov: 11821 ft: 14153 corp: 31/1007b lim: 40 exec/s: 39 rss: 69Mb L: 35/40 MS: 1 CrossOver- 00:08:09.083 [2024-11-28 07:33:19.831393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.831418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.831477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:666666ff cdw11:ffffff01 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.831492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.831547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00999999 cdw11:99999266 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.831561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.083 [2024-11-28 07:33:19.831621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.083 [2024-11-28 07:33:19.831636] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.343 #40 NEW cov: 11821 ft: 14158 corp: 32/1045b lim: 40 exec/s: 40 rss: 69Mb L: 38/40 MS: 1 ChangeBinInt- 00:08:09.343 [2024-11-28 07:33:19.871208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.871233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.871289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.871303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.911339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.911364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.911422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666601 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.911436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 #42 NEW cov: 11821 ft: 14195 corp: 33/1065b lim: 40 exec/s: 42 rss: 69Mb L: 20/40 MS: 2 ChangeBit-CMP- DE: "\001\000"- 00:08:09.343 [2024-11-28 07:33:19.951771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.951797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.951861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.951876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.951934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.951948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.952007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.952021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.343 #43 NEW cov: 11821 ft: 14280 corp: 34/1102b lim: 40 exec/s: 43 rss: 69Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:09.343 [2024-11-28 07:33:19.991767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.991793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.991855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.991870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:19.991928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66667979 cdw11:79797979 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:19.991941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.343 #44 NEW cov: 11821 ft: 14473 corp: 35/1132b lim: 40 exec/s: 44 rss: 69Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:08:09.343 [2024-11-28 07:33:20.032012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.032040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.032100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6666660f cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.032115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.032172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.032186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.032243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.032257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.343 #45 NEW cov: 11821 ft: 14497 corp: 36/1166b lim: 40 exec/s: 45 rss: 69Mb L: 34/40 MS: 1 ChangeByte- 00:08:09.343 [2024-11-28 07:33:20.082230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.082258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.082320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.082335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.082393] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.082407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.343 [2024-11-28 07:33:20.082466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.343 [2024-11-28 07:33:20.082480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.343 #46 NEW cov: 11821 ft: 14507 corp: 37/1201b lim: 40 exec/s: 46 rss: 69Mb L: 35/40 MS: 1 ChangeBit- 00:08:09.603 [2024-11-28 07:33:20.122136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:6666ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.122163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.603 [2024-11-28 07:33:20.122239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.122254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.603 [2024-11-28 07:33:20.122301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff01ffff cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.122316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.603 #47 NEW cov: 11821 ft: 14561 corp: 38/1230b lim: 40 exec/s: 47 rss: 69Mb L: 29/40 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\001"- 00:08:09.603 [2024-11-28 07:33:20.162049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666601 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.162074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.603 [2024-11-28 07:33:20.162136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.162150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.603 #48 NEW cov: 11821 ft: 14574 corp: 39/1252b lim: 40 exec/s: 48 rss: 69Mb L: 22/40 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:09.603 [2024-11-28 07:33:20.202693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 07:33:20.202719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.603 [2024-11-28 07:33:20.202798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.603 [2024-11-28 
07:33:20.202813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.202872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.202890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.202951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.202965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.203028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:66666666 cdw11:66666604 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.203042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.604 #49 NEW cov: 11821 ft: 14582 corp: 40/1292b lim: 40 exec/s: 49 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:08:09.604 [2024-11-28 07:33:20.242632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.242658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.242744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff6666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.242759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.242819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66663966 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.242833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.242864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.242878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.282764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:ffffff66 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.282790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.282866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.282880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.282937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66663966 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.282951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.283011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.283025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.604 #51 NEW cov: 11821 ft: 14592 corp: 41/1327b lim: 40 exec/s: 51 rss: 69Mb L: 35/40 MS: 2 ChangeByte-CopyPart- 00:08:09.604 [2024-11-28 07:33:20.322911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.322937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.323000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.323015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.323074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.323088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.604 [2024-11-28 07:33:20.323145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.604 [2024-11-28 07:33:20.323159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.604 #52 NEW cov: 11821 ft: 14598 corp: 42/1365b lim: 40 exec/s: 26 rss: 70Mb L: 38/40 MS: 1 CopyPart- 00:08:09.604 #52 DONE cov: 11821 ft: 14598 corp: 42/1365b lim: 40 exec/s: 26 rss: 70Mb 00:08:09.604 ###### Recommended dictionary. ###### 00:08:09.604 "\377\377\377\377\377\377\377\001" # Uses: 2 00:08:09.604 "\001\000" # Uses: 1 00:08:09.604 ###### End of recommended dictionary. 
###### 00:08:09.604 Done 52 runs in 2 second(s) 00:08:09.862 07:33:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:09.862 07:33:20 -- ../common.sh@72 -- # (( i++ )) 00:08:09.862 07:33:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.862 07:33:20 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:09.862 07:33:20 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:09.862 07:33:20 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.862 07:33:20 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.862 07:33:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:09.862 07:33:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:09.862 07:33:20 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:09.862 07:33:20 -- nvmf/run.sh@29 -- # port=4413 00:08:09.862 07:33:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:09.862 07:33:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:09.862 07:33:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.862 07:33:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:09.862 [2024-11-28 07:33:20.504824] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:09.862 [2024-11-28 07:33:20.504919] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661412 ] 00:08:09.862 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.120 [2024-11-28 07:33:20.681003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.120 [2024-11-28 07:33:20.700877] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.120 [2024-11-28 07:33:20.701010] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.120 [2024-11-28 07:33:20.752469] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.120 [2024-11-28 07:33:20.768850] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:10.120 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.120 INFO: Seed: 1884814290 00:08:10.120 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:10.120 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:10.120 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:10.120 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.120 #2 INITED exec/s: 0 rss: 59Mb 00:08:10.120 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:10.120 This may also happen if the target rejected all inputs we tried so far 00:08:10.120 [2024-11-28 07:33:20.834860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.120 [2024-11-28 07:33:20.834898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.379 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:10.379 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.379 #11 NEW cov: 11579 ft: 11580 corp: 2/12b lim: 40 exec/s: 0 rss: 67Mb L: 11/11 MS: 4 ChangeByte-InsertByte-InsertByte-CMP- DE: "\001\000\000\000\000\000\000\003"- 00:08:10.639 [2024-11-28 07:33:21.156192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.156230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.156362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.156379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.156508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.156525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.156661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.156679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.639 #18 NEW cov: 11695 ft: 12814 corp: 3/51b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:10.639 [2024-11-28 07:33:21.195690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.195718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.195842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.195857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.639 #19 NEW cov: 11701 ft: 13267 corp: 4/70b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\003"- 00:08:10.639 [2024-11-28 
07:33:21.235571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a320100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.235602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 #24 NEW cov: 11786 ft: 13561 corp: 5/78b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 5 CrossOver-InsertByte-CopyPart-ShuffleBytes-CrossOver- 00:08:10.639 [2024-11-28 07:33:21.275997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.276023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.276146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.276162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.639 #25 NEW cov: 11786 ft: 13684 corp: 6/98b lim: 40 exec/s: 0 rss: 67Mb L: 20/39 MS: 1 InsertByte- 00:08:10.639 [2024-11-28 07:33:21.316195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.316221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.316344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.316360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.316489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.316508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.639 #26 NEW cov: 11786 ft: 13923 corp: 7/127b lim: 40 exec/s: 0 rss: 67Mb L: 29/39 MS: 1 CrossOver- 00:08:10.639 [2024-11-28 07:33:21.356259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.356286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.356400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000300f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.356416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.639 #27 NEW cov: 11786 ft: 14034 corp: 8/148b lim: 40 exec/s: 0 rss: 67Mb L: 21/39 MS: 1 InsertByte- 00:08:10.639 [2024-11-28 07:33:21.396324] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a320100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.396352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.639 [2024-11-28 07:33:21.396472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.639 [2024-11-28 07:33:21.396488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.898 #28 NEW cov: 11786 ft: 14111 corp: 9/165b lim: 40 exec/s: 0 rss: 67Mb L: 17/39 MS: 1 CrossOver- 00:08:10.898 [2024-11-28 07:33:21.436925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.436953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.437066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.437086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.437184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000300 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.437203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.437328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58580000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.437345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.898 #29 NEW cov: 11786 ft: 14142 corp: 10/204b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:10.898 [2024-11-28 07:33:21.486453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:36c39393 cdw11:93939393 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.486480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 #32 NEW cov: 11786 ft: 14163 corp: 11/213b lim: 40 exec/s: 0 rss: 68Mb L: 9/39 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:10.898 [2024-11-28 07:33:21.526721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.526748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.526884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 
[2024-11-28 07:33:21.526902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.898 #33 NEW cov: 11786 ft: 14186 corp: 12/232b lim: 40 exec/s: 0 rss: 68Mb L: 19/39 MS: 1 ChangeBit- 00:08:10.898 [2024-11-28 07:33:21.567395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.567421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.567560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.567577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.567708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.567724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.567847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.567865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.568000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03c3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.568017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.898 #34 NEW cov: 11786 ft: 14244 corp: 13/272b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:10.898 [2024-11-28 07:33:21.617438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.617465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.617593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00030058 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.617615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.617747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.617777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.617910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58000000 
cdw11:000003c3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.617927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.898 #35 NEW cov: 11786 ft: 14261 corp: 14/306b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 EraseBytes- 00:08:10.898 [2024-11-28 07:33:21.667294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.667320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.898 [2024-11-28 07:33:21.667440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0003003b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.898 [2024-11-28 07:33:21.667458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.157 #36 NEW cov: 11786 ft: 14267 corp: 15/327b lim: 40 exec/s: 0 rss: 68Mb L: 21/40 MS: 1 InsertByte- 00:08:11.157 [2024-11-28 07:33:21.707968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000506 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.707996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.708128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.708145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.708270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.708304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.708434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.708452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.708594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03c3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.708615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.157 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.157 #37 NEW cov: 11809 ft: 14312 corp: 16/367b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:08:11.157 [2024-11-28 07:33:21.747360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01ac0100 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 
07:33:21.747387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.747527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.747543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.157 #38 NEW cov: 11809 ft: 14342 corp: 17/386b lim: 40 exec/s: 0 rss: 68Mb L: 19/40 MS: 1 ChangeByte- 00:08:11.157 [2024-11-28 07:33:21.787900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.787927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.788052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.788068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.788187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.788203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.788326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.788343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.157 #39 NEW cov: 11809 ft: 14396 corp: 18/425b lim: 40 exec/s: 39 rss: 68Mb L: 39/40 MS: 1 ShuffleBytes- 00:08:11.157 [2024-11-28 07:33:21.837399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01060000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.837426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 #40 NEW cov: 11809 ft: 14421 corp: 19/436b lim: 40 exec/s: 40 rss: 68Mb L: 11/40 MS: 1 ChangeBinInt- 00:08:11.157 [2024-11-28 07:33:21.878096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.878123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.878243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.878262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.878384] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f0f0f001 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.878401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.878522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:03000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.878541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.157 #41 NEW cov: 11809 ft: 14443 corp: 20/474b lim: 40 exec/s: 41 rss: 68Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:11.157 [2024-11-28 07:33:21.917878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.917905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.157 [2024-11-28 07:33:21.918037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:03000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.157 [2024-11-28 07:33:21.918054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.416 #42 NEW cov: 11809 ft: 14452 corp: 21/493b lim: 40 exec/s: 42 rss: 68Mb L: 19/40 MS: 1 ShuffleBytes- 00:08:11.416 [2024-11-28 07:33:21.957764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01060000 cdw11:76000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:21.957791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.416 #43 NEW cov: 11809 ft: 14457 corp: 22/505b lim: 40 exec/s: 43 rss: 68Mb L: 12/40 MS: 1 InsertByte- 00:08:11.416 [2024-11-28 07:33:21.997899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:21.997924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.416 #44 NEW cov: 11809 ft: 14465 corp: 23/517b lim: 40 exec/s: 44 rss: 68Mb L: 12/40 MS: 1 EraseBytes- 00:08:11.416 [2024-11-28 07:33:22.038209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:22.038236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.416 [2024-11-28 07:33:22.038362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000300f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:22.038379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.416 #45 NEW cov: 11809 ft: 14536 corp: 24/538b lim: 40 exec/s: 45 rss: 68Mb L: 21/40 MS: 1 ShuffleBytes- 00:08:11.416 
[2024-11-28 07:33:22.078942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000506 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:22.078967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.416 [2024-11-28 07:33:22.079093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:22.079116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.416 [2024-11-28 07:33:22.079246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.416 [2024-11-28 07:33:22.079262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.416 [2024-11-28 07:33:22.079393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:50585858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.079413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.417 [2024-11-28 07:33:22.079553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03c3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.079570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.417 #46 NEW cov: 11809 ft: 14547 corp: 25/578b lim: 40 exec/s: 46 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:08:11.417 [2024-11-28 07:33:22.129150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.129175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.417 [2024-11-28 07:33:22.129292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.129309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.417 [2024-11-28 07:33:22.129434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.129450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.417 [2024-11-28 07:33:22.129577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00035858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.129593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.417 [2024-11-28 07:33:22.129725] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03c3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.129742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.417 #47 NEW cov: 11809 ft: 14585 corp: 26/618b lim: 40 exec/s: 47 rss: 68Mb L: 40/40 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\003"- 00:08:11.417 [2024-11-28 07:33:22.168433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0106002d cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.417 [2024-11-28 07:33:22.168459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 #48 NEW cov: 11809 ft: 14604 corp: 27/629b lim: 40 exec/s: 48 rss: 68Mb L: 11/40 MS: 1 ChangeByte- 00:08:11.676 [2024-11-28 07:33:22.209269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.209296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.209413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.209431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.209550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.209566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.209698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.209716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.209838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03e3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.209856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.676 #49 NEW cov: 11809 ft: 14617 corp: 28/669b lim: 40 exec/s: 49 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:08:11.676 [2024-11-28 07:33:22.248577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:36c39393 cdw11:93939300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.248607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 #50 NEW cov: 11809 ft: 14622 corp: 29/678b lim: 40 exec/s: 50 rss: 69Mb L: 9/40 MS: 1 CrossOver- 00:08:11.676 [2024-11-28 07:33:22.288804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.288832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 #51 NEW cov: 11809 ft: 14642 corp: 30/690b lim: 40 exec/s: 51 rss: 69Mb L: 12/40 MS: 1 EraseBytes- 00:08:11.676 [2024-11-28 07:33:22.329098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000108 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.329125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.329254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00030000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.329269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.676 #52 NEW cov: 11809 ft: 14693 corp: 31/709b lim: 40 exec/s: 52 rss: 69Mb L: 19/40 MS: 1 ChangeBit- 00:08:11.676 [2024-11-28 07:33:22.369020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01ff91fb cdw11:ea036a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.369046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 #53 NEW cov: 11809 ft: 14700 corp: 32/721b lim: 40 exec/s: 53 rss: 69Mb L: 12/40 MS: 1 CMP- DE: "\377\221\373\352\003j*\""- 00:08:11.676 [2024-11-28 07:33:22.409409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a320100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.409435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.676 [2024-11-28 07:33:22.409563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.676 [2024-11-28 07:33:22.409580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.676 #54 NEW cov: 11809 ft: 14714 corp: 33/738b lim: 40 exec/s: 54 rss: 69Mb L: 17/40 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:11.934 [2024-11-28 07:33:22.449319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00002003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.449345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 #55 NEW cov: 11809 ft: 14743 corp: 34/749b lim: 40 exec/s: 55 rss: 69Mb L: 11/40 MS: 1 ChangeBit- 00:08:11.934 [2024-11-28 07:33:22.489873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000100 cdw11:ff91fbea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.489899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 
07:33:22.490032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:155896fa cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.490049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.490178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00030000 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.490194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.934 #56 NEW cov: 11809 ft: 14754 corp: 35/776b lim: 40 exec/s: 56 rss: 69Mb L: 27/40 MS: 1 CMP- DE: "\377\221\373\352\025X\226\372"- 00:08:11.934 [2024-11-28 07:33:22.530372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.530398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.530535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0606 cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.530552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.530709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.530727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.530870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:5c585858 cdw11:58585800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.530889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.531023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:03e3e640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.531041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.934 #57 NEW cov: 11809 ft: 14769 corp: 36/816b lim: 40 exec/s: 57 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:11.934 [2024-11-28 07:33:22.580141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000506 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.580168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.580305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:060a0658 cdw11:58585858 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.580322] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.580442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.580462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.934 #58 NEW cov: 11809 ft: 14781 corp: 37/845b lim: 40 exec/s: 58 rss: 69Mb L: 29/40 MS: 1 EraseBytes- 00:08:11.934 [2024-11-28 07:33:22.620090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a320100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.620116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.620242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.620257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.934 #59 NEW cov: 11809 ft: 14815 corp: 38/862b lim: 40 exec/s: 59 rss: 69Mb L: 17/40 MS: 1 ChangeBinInt- 00:08:11.934 [2024-11-28 07:33:22.670369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.670396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.670527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.670546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.934 [2024-11-28 07:33:22.670682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ff91 cdw11:fbea036a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.934 [2024-11-28 07:33:22.670699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.934 #60 NEW cov: 11809 ft: 14861 corp: 39/891b lim: 40 exec/s: 60 rss: 69Mb L: 29/40 MS: 1 PersAutoDict- DE: "\377\221\373\352\003j*\""- 00:08:12.194 [2024-11-28 07:33:22.710555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01060001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.710580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.710640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00010606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.710656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.710786] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0a060606 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.710804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.194 #61 NEW cov: 11809 ft: 14865 corp: 40/918b lim: 40 exec/s: 61 rss: 69Mb L: 27/40 MS: 1 CrossOver- 00:08:12.194 [2024-11-28 07:33:22.750822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01ac0100 cdw11:00400001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.750850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.750971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:01010101 cdw11:01010101 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.750989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.751126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:01010101 cdw11:01000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.751142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.751273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0003c3e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.751291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.194 #62 NEW cov: 11809 ft: 14873 corp: 41/951b lim: 40 exec/s: 62 rss: 69Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:08:12.194 [2024-11-28 07:33:22.790561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000106 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.790586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.194 [2024-11-28 07:33:22.790708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:06060606 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.194 [2024-11-28 07:33:22.790728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.194 #63 NEW cov: 11809 ft: 14880 corp: 42/972b lim: 40 exec/s: 63 rss: 69Mb L: 21/40 MS: 1 EraseBytes- 00:08:12.194 #63 DONE cov: 11809 ft: 14880 corp: 42/972b lim: 40 exec/s: 31 rss: 69Mb 00:08:12.194 ###### Recommended dictionary. ###### 00:08:12.194 "\001\000\000\000\000\000\000\003" # Uses: 2 00:08:12.194 "\377\221\373\352\003j*\"" # Uses: 1 00:08:12.194 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:12.194 "\377\221\373\352\025X\226\372" # Uses: 0 00:08:12.194 ###### End of recommended dictionary. 
###### 00:08:12.194 Done 63 runs in 2 second(s) 00:08:12.194 07:33:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:12.194 07:33:22 -- ../common.sh@72 -- # (( i++ )) 00:08:12.194 07:33:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.194 07:33:22 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:12.194 07:33:22 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:12.194 07:33:22 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.194 07:33:22 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.194 07:33:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:12.194 07:33:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:12.194 07:33:22 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:12.194 07:33:22 -- nvmf/run.sh@29 -- # port=4414 00:08:12.194 07:33:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:12.194 07:33:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:12.194 07:33:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.194 07:33:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:12.453 [2024-11-28 07:33:22.964469] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:12.453 [2024-11-28 07:33:22.964538] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661461 ] 00:08:12.453 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.453 [2024-11-28 07:33:23.138590] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.453 [2024-11-28 07:33:23.157970] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.453 [2024-11-28 07:33:23.158085] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.453 [2024-11-28 07:33:23.209292] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.712 [2024-11-28 07:33:23.225675] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:12.712 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.712 INFO: Seed: 46832778 00:08:12.712 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:12.712 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:12.712 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:12.712 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.712 #2 INITED exec/s: 0 rss: 59Mb 00:08:12.712 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
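The run.sh trace above (between runs 13 and 14) shows how each fuzzer instance is prepared: the script derives a per-fuzzer TCP port from the fuzzer number (printf %02d appended to "44", so fuzzer 14 listens on 4414), rewrites the trsvcid in a copy of fuzz_json.conf with sed, makes a fresh corpus directory, and launches llvm_nvme_fuzz against the resulting transport ID. The following is a minimal bash sketch of that sequence, not the verbatim script: the SPDK_DIR and OUTPUT_DIR variables are placeholders for the workspace paths, and the redirection of sed's output into the per-run config is inferred (the trace does not show the redirection itself).

# Sketch of the per-run setup traced above -- paths and variable names illustrative.
fuzzer_type=14                                   # sub-fuzzer selected via -Z
timen=1                                          # time budget passed through as -t
port=44$(printf %02d "$fuzzer_type")             # 4414 for fuzzer 14, 4415 for 15, ...
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}

mkdir -p "$corpus_dir"
# Point the JSON config at the per-fuzzer port instead of the default 4420.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m 0x1 -s 512 \
    -P "$OUTPUT_DIR/llvm/" \
    -F "$trid" \
    -c "$nvmf_cfg" \
    -t "$timen" \
    -D "$corpus_dir" \
    -Z "$fuzzer_type" \
    -r "/var/tmp/spdk${fuzzer_type}.sock"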
00:08:12.712 This may also happen if the target rejected all inputs we tried so far 00:08:12.712 [2024-11-28 07:33:23.281612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.712 [2024-11-28 07:33:23.281651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.712 [2024-11-28 07:33:23.281720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.712 [2024-11-28 07:33:23.281741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.712 [2024-11-28 07:33:23.281809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.712 [2024-11-28 07:33:23.281830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.712 [2024-11-28 07:33:23.281898] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.712 [2024-11-28 07:33:23.281918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.971 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:12.971 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.971 #3 NEW cov: 11576 ft: 11577 corp: 2/29b lim: 35 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:12.971 [2024-11-28 07:33:23.592093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.592149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.592242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.592271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.971 #13 NEW cov: 11696 ft: 12543 corp: 3/43b lim: 35 exec/s: 0 rss: 67Mb L: 14/28 MS: 5 ChangeByte-ChangeBit-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:12.971 [2024-11-28 07:33:23.642306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.642334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.642388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.642404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 
07:33:23.642461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.642477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.642532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.642547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.971 #22 NEW cov: 11702 ft: 12786 corp: 4/72b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 4 InsertByte-InsertByte-EraseBytes-CrossOver- 00:08:12.971 [2024-11-28 07:33:23.682438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.682466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.682525] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.682542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.682602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.682614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.682633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.682648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.971 #23 NEW cov: 11787 ft: 13043 corp: 5/101b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:08:12.971 [2024-11-28 07:33:23.722519] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.722546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.722605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.722621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.722677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.722693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.971 [2024-11-28 07:33:23.722752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 
cdw10:000000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.971 [2024-11-28 07:33:23.722766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.230 #24 NEW cov: 11787 ft: 13084 corp: 6/132b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:13.230 [2024-11-28 07:33:23.762649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.230 [2024-11-28 07:33:23.762677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.230 [2024-11-28 07:33:23.762733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.230 [2024-11-28 07:33:23.762752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.230 [2024-11-28 07:33:23.762806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.230 [2024-11-28 07:33:23.762821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.230 [2024-11-28 07:33:23.762878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.230 [2024-11-28 07:33:23.762893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.230 #25 NEW cov: 11787 ft: 13151 corp: 7/165b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:08:13.230 [2024-11-28 07:33:23.802766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.230 [2024-11-28 07:33:23.802793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.230 [2024-11-28 07:33:23.802847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.802862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.802915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.802929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.802985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.803000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.231 #26 NEW cov: 11787 ft: 13293 corp: 8/193b lim: 35 exec/s: 0 rss: 68Mb L: 28/33 MS: 1 CopyPart- 00:08:13.231 [2024-11-28 07:33:23.842874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.842902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.842959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.842975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.843031] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.843045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.843101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.843115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.231 #27 NEW cov: 11787 ft: 13318 corp: 9/222b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:13.231 [2024-11-28 07:33:23.883044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.883073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.883148] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.883164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.883220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.883233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.883289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.883304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.231 #28 NEW cov: 11787 ft: 13409 corp: 10/255b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:13.231 [2024-11-28 07:33:23.923128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.923153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.923209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.923224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE 
ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.923278] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.923294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.923349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.923364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.231 #29 NEW cov: 11787 ft: 13435 corp: 11/284b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:13.231 [2024-11-28 07:33:23.963286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.963314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.963385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.963400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.963452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.963468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.231 [2024-11-28 07:33:23.963522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.231 [2024-11-28 07:33:23.963536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.231 #30 NEW cov: 11787 ft: 13489 corp: 12/312b lim: 35 exec/s: 0 rss: 68Mb L: 28/33 MS: 1 ChangeBit- 00:08:13.490 [2024-11-28 07:33:24.003318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.003349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.003405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.003421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.003476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.003492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.003547] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.003560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.490 #31 NEW cov: 11787 ft: 13659 corp: 13/341b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:13.490 [2024-11-28 07:33:24.043456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.043483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.043555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.043571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.043627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.043644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.043700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.043715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.490 #32 NEW cov: 11787 ft: 13691 corp: 14/371b lim: 35 exec/s: 0 rss: 68Mb L: 30/33 MS: 1 InsertByte- 00:08:13.490 [2024-11-28 07:33:24.083580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.490 [2024-11-28 07:33:24.083610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.490 [2024-11-28 07:33:24.083684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.083699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.083755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.083770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.083827] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.083842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.491 #33 NEW cov: 11787 ft: 13715 corp: 15/401b lim: 35 exec/s: 0 rss: 68Mb L: 30/33 MS: 1 ChangeBinInt- 00:08:13.491 [2024-11-28 07:33:24.123664] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.123690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.123747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.123763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.123819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.123834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.123889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.123903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.491 #34 NEW cov: 11787 ft: 13737 corp: 16/430b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:13.491 [2024-11-28 07:33:24.163781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.163808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.163866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.163883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.163936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.163952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.164007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.164023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.491 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.491 #35 NEW cov: 11810 ft: 13857 corp: 17/459b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ChangeByte- 00:08:13.491 [2024-11-28 07:33:24.203900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.203927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.203984] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.203999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.204054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.204068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.204124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.204143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.491 #36 NEW cov: 11810 ft: 13872 corp: 18/487b lim: 35 exec/s: 0 rss: 68Mb L: 28/33 MS: 1 ChangeByte- 00:08:13.491 [2024-11-28 07:33:24.244049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.244076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.244134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.244150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.244204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.244218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.491 [2024-11-28 07:33:24.244270] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.491 [2024-11-28 07:33:24.244285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.750 #37 NEW cov: 11810 ft: 13898 corp: 19/516b lim: 35 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 ChangeBinInt- 00:08:13.750 [2024-11-28 07:33:24.284111] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.750 [2024-11-28 07:33:24.284136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.750 [2024-11-28 07:33:24.284209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.750 [2024-11-28 07:33:24.284225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.750 [2024-11-28 07:33:24.284280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.750 
[2024-11-28 07:33:24.284294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.750 [2024-11-28 07:33:24.284360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.750 [2024-11-28 07:33:24.284375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.750 #38 NEW cov: 11810 ft: 13927 corp: 20/546b lim: 35 exec/s: 38 rss: 68Mb L: 30/33 MS: 1 ChangeBinInt- 00:08:13.750 [2024-11-28 07:33:24.323947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.750 [2024-11-28 07:33:24.323974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.324030] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.324046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.751 #39 NEW cov: 11810 ft: 13969 corp: 21/563b lim: 35 exec/s: 39 rss: 68Mb L: 17/33 MS: 1 InsertRepeatedBytes- 00:08:13.751 [2024-11-28 07:33:24.364552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.364580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.364653] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000db SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.364670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.364727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.364742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.364797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.364814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.364868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.364884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.751 #40 NEW cov: 11810 ft: 14064 corp: 22/598b lim: 35 exec/s: 40 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:13.751 [2024-11-28 07:33:24.404178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 
[2024-11-28 07:33:24.404205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.404261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.404276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.751 #41 NEW cov: 11810 ft: 14115 corp: 23/615b lim: 35 exec/s: 41 rss: 68Mb L: 17/35 MS: 1 ShuffleBytes- 00:08:13.751 [2024-11-28 07:33:24.444591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.444621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.444680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.444695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.444752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.444768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.444825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.444841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.751 #47 NEW cov: 11810 ft: 14135 corp: 24/648b lim: 35 exec/s: 47 rss: 68Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:13.751 [2024-11-28 07:33:24.474384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.474411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.474471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.474487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.751 #48 NEW cov: 11810 ft: 14169 corp: 25/665b lim: 35 exec/s: 48 rss: 69Mb L: 17/35 MS: 1 EraseBytes- 00:08:13.751 [2024-11-28 07:33:24.514489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 07:33:24.514514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.751 [2024-11-28 07:33:24.514570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.751 [2024-11-28 
07:33:24.514584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.010 #49 NEW cov: 11810 ft: 14220 corp: 26/679b lim: 35 exec/s: 49 rss: 69Mb L: 14/35 MS: 1 ChangeBinInt- 00:08:14.010 [2024-11-28 07:33:24.554921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.554948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.555003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.555019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.555075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.555090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.555145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.555160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.010 #50 NEW cov: 11810 ft: 14280 corp: 27/713b lim: 35 exec/s: 50 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:08:14.010 [2024-11-28 07:33:24.595066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.595094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.595150] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.595166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.595220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.595234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.010 [2024-11-28 07:33:24.595290] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.010 [2024-11-28 07:33:24.595305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.010 #51 NEW cov: 11810 ft: 14297 corp: 28/742b lim: 35 exec/s: 51 rss: 69Mb L: 29/35 MS: 1 ChangeBit- 00:08:14.010 [2024-11-28 07:33:24.634980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.635011] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.635072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.635087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.635145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.635160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.011 #52 NEW cov: 11810 ft: 14473 corp: 29/767b lim: 35 exec/s: 52 rss: 69Mb L: 25/35 MS: 1 EraseBytes- 00:08:14.011 [2024-11-28 07:33:24.675295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.675322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.675379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.675392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.675448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.675463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.675518] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000061 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.675532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.011 #53 NEW cov: 11810 ft: 14482 corp: 30/800b lim: 35 exec/s: 53 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:08:14.011 [2024-11-28 07:33:24.715238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.715265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.715325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.715339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.715396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.715410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.011 #54 
NEW cov: 11810 ft: 14508 corp: 31/821b lim: 35 exec/s: 54 rss: 69Mb L: 21/35 MS: 1 EraseBytes- 00:08:14.011 [2024-11-28 07:33:24.755503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.755530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.755587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.755607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.755669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.755684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.011 [2024-11-28 07:33:24.755744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.011 [2024-11-28 07:33:24.755759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.011 #55 NEW cov: 11810 ft: 14573 corp: 32/853b lim: 35 exec/s: 55 rss: 69Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:08:14.270 [2024-11-28 07:33:24.795584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.795615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.270 [2024-11-28 07:33:24.795690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000003d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.795705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.270 [2024-11-28 07:33:24.795776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.795792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.270 [2024-11-28 07:33:24.795848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.795864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.270 #56 NEW cov: 11810 ft: 14586 corp: 33/883b lim: 35 exec/s: 56 rss: 69Mb L: 30/35 MS: 1 ChangeBinInt- 00:08:14.270 [2024-11-28 07:33:24.835772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.835800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.270 [2024-11-28 
07:33:24.835874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.270 [2024-11-28 07:33:24.835890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.835946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.835961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.836018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.836033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.271 #57 NEW cov: 11810 ft: 14588 corp: 34/917b lim: 35 exec/s: 57 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:08:14.271 [2024-11-28 07:33:24.875845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.875873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.875930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.875949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.876003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.876016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.876071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.876087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.271 #58 NEW cov: 11810 ft: 14624 corp: 35/950b lim: 35 exec/s: 58 rss: 69Mb L: 33/35 MS: 1 ChangeBit- 00:08:14.271 [2024-11-28 07:33:24.915623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.915652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.915711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.915727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.271 #59 NEW cov: 11810 ft: 14636 corp: 36/967b lim: 35 exec/s: 59 rss: 69Mb L: 17/35 MS: 1 ChangeBit- 00:08:14.271 [2024-11-28 07:33:24.955787] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.955814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.955869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.955883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.271 NEW_FUNC[1/1]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:14.271 #60 NEW cov: 11820 ft: 14661 corp: 37/982b lim: 35 exec/s: 60 rss: 69Mb L: 15/35 MS: 1 CrossOver- 00:08:14.271 [2024-11-28 07:33:24.995874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.995902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.271 [2024-11-28 07:33:24.995956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:24.995970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.271 #61 NEW cov: 11820 ft: 14699 corp: 38/998b lim: 35 exec/s: 61 rss: 69Mb L: 16/35 MS: 1 EraseBytes- 00:08:14.271 [2024-11-28 07:33:25.035851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.271 [2024-11-28 07:33:25.035877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.528 #62 NEW cov: 11820 ft: 15378 corp: 39/1010b lim: 35 exec/s: 62 rss: 69Mb L: 12/35 MS: 1 EraseBytes- 00:08:14.528 [2024-11-28 07:33:25.086446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.086474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.528 [2024-11-28 07:33:25.086536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.086552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.528 [2024-11-28 07:33:25.086610] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.086626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.528 [2024-11-28 07:33:25.086683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.086699] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.528 #63 NEW cov: 11820 ft: 15386 corp: 40/1040b lim: 35 exec/s: 63 rss: 69Mb L: 30/35 MS: 1 ChangeByte- 00:08:14.528 [2024-11-28 07:33:25.126532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.126559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.528 [2024-11-28 07:33:25.126615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.528 [2024-11-28 07:33:25.126631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.126684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.126699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.126756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.126771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.529 #64 NEW cov: 11820 ft: 15413 corp: 41/1074b lim: 35 exec/s: 64 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:08:14.529 [2024-11-28 07:33:25.166562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.166589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.166668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.166684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.166740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.166755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.529 #65 NEW cov: 11820 ft: 15461 corp: 42/1100b lim: 35 exec/s: 65 rss: 70Mb L: 26/35 MS: 1 InsertByte- 00:08:14.529 [2024-11-28 07:33:25.206777] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.206804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.206865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.206883] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.206944] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.206960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.207021] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.207036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.529 #71 NEW cov: 11820 ft: 15467 corp: 43/1129b lim: 35 exec/s: 71 rss: 70Mb L: 29/35 MS: 1 CopyPart- 00:08:14.529 [2024-11-28 07:33:25.246901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.246928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.246977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.246993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.247051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.247066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.529 [2024-11-28 07:33:25.247123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.529 [2024-11-28 07:33:25.247139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.529 #72 NEW cov: 11820 ft: 15480 corp: 44/1163b lim: 35 exec/s: 36 rss: 70Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:14.529 #72 DONE cov: 11820 ft: 15480 corp: 44/1163b lim: 35 exec/s: 36 rss: 70Mb 00:08:14.529 Done 72 runs in 2 second(s) 00:08:14.787 07:33:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:14.787 07:33:25 -- ../common.sh@72 -- # (( i++ )) 00:08:14.787 07:33:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.787 07:33:25 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:14.787 07:33:25 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:14.787 07:33:25 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.787 07:33:25 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.787 07:33:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:14.787 07:33:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:14.787 07:33:25 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:14.787 07:33:25 -- nvmf/run.sh@29 -- # port=4415 00:08:14.787 07:33:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:14.787 07:33:25 -- 
nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:14.787 07:33:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.787 07:33:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:14.787 [2024-11-28 07:33:25.428513] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:14.787 [2024-11-28 07:33:25.428617] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661499 ] 00:08:14.787 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.046 [2024-11-28 07:33:25.610499] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.046 [2024-11-28 07:33:25.630090] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.046 [2024-11-28 07:33:25.630220] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.046 [2024-11-28 07:33:25.681452] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.046 [2024-11-28 07:33:25.697800] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:15.046 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.046 INFO: Seed: 2516831624 00:08:15.046 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:15.046 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:15.046 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:15.046 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.046 #2 INITED exec/s: 0 rss: 59Mb 00:08:15.046 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.046 This may also happen if the target rejected all inputs we tried so far 00:08:15.046 [2024-11-28 07:33:25.743312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000230 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.046 [2024-11-28 07:33:25.743340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.046 [2024-11-28 07:33:25.743412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.046 [2024-11-28 07:33:25.743426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.046 [2024-11-28 07:33:25.743478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.046 [2024-11-28 07:33:25.743492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.046 [2024-11-28 07:33:25.743545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.046 [2024-11-28 07:33:25.743559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.305 NEW_FUNC[1/669]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:15.305 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.305 #5 NEW cov: 11557 ft: 11565 corp: 2/31b lim: 35 exec/s: 0 rss: 66Mb L: 30/30 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:15.305 [2024-11-28 07:33:26.063962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.305 [2024-11-28 07:33:26.063992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.305 [2024-11-28 07:33:26.064049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.305 [2024-11-28 07:33:26.064062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.305 [2024-11-28 07:33:26.064119] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.305 [2024-11-28 07:33:26.064132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 NEW_FUNC[1/1]: 0x1c72df8 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1057 00:08:15.564 #18 NEW cov: 11677 ft: 12618 corp: 3/58b lim: 35 exec/s: 0 rss: 67Mb L: 27/30 MS: 3 InsertByte-ShuffleBytes-CrossOver- 00:08:15.564 [2024-11-28 07:33:26.103996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.104022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.104094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.104108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.104162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.104175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 #23 NEW cov: 11683 ft: 12858 corp: 4/82b lim: 35 exec/s: 0 rss: 67Mb L: 24/30 MS: 5 ChangeBinInt-ShuffleBytes-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:15.564 [2024-11-28 07:33:26.144128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.144154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.144227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.144241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.144297] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.144310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 #24 NEW cov: 11768 ft: 13087 corp: 5/109b lim: 35 exec/s: 0 rss: 67Mb L: 27/30 MS: 1 CopyPart- 00:08:15.564 [2024-11-28 07:33:26.184363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.184388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.184444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.184457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.184514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.184527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.184582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.184595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.564 #25 NEW cov: 11768 ft: 13136 corp: 6/137b lim: 35 exec/s: 0 rss: 67Mb L: 28/30 MS: 1 InsertByte- 
00:08:15.564 [2024-11-28 07:33:26.224086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000123 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.224113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 #27 NEW cov: 11768 ft: 13559 corp: 7/144b lim: 35 exec/s: 0 rss: 67Mb L: 7/30 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:15.564 [2024-11-28 07:33:26.264467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.264491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.264548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.264562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.264620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.264634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 #28 NEW cov: 11768 ft: 13647 corp: 8/168b lim: 35 exec/s: 0 rss: 67Mb L: 24/30 MS: 1 CMP- DE: "\001\000\002\000"- 00:08:15.564 [2024-11-28 07:33:26.304617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.304642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.304700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.304714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.564 [2024-11-28 07:33:26.304769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.564 [2024-11-28 07:33:26.304782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.564 #29 NEW cov: 11768 ft: 13704 corp: 9/192b lim: 35 exec/s: 0 rss: 67Mb L: 24/30 MS: 1 ShuffleBytes- 00:08:15.823 [2024-11-28 07:33:26.344908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000230 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.823 [2024-11-28 07:33:26.344933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.344987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.345001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.345055] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.345069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.345122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.345135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.824 #30 NEW cov: 11768 ft: 13791 corp: 10/222b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 ShuffleBytes- 00:08:15.824 [2024-11-28 07:33:26.384869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.384894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.384952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.384966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.385024] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.385037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.824 #31 NEW cov: 11768 ft: 13825 corp: 11/246b lim: 35 exec/s: 0 rss: 67Mb L: 24/30 MS: 1 ChangeByte- 00:08:15.824 [2024-11-28 07:33:26.425113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000230 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.425138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.425209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.425223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.425275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.425288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.425342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.425356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.824 #32 NEW cov: 11768 ft: 13834 corp: 12/276b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 ChangeBinInt- 00:08:15.824 [2024-11-28 07:33:26.465191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.465215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.465273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.465286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.465343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.465356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.465413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000276 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.465425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.824 #33 NEW cov: 11768 ft: 13870 corp: 13/304b lim: 35 exec/s: 0 rss: 68Mb L: 28/30 MS: 1 ChangeBit- 00:08:15.824 [2024-11-28 07:33:26.504952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.504976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 #34 NEW cov: 11768 ft: 13928 corp: 14/311b lim: 35 exec/s: 0 rss: 69Mb L: 7/30 MS: 1 ChangeBit- 00:08:15.824 [2024-11-28 07:33:26.545352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.545379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.545437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.545450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.545506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.545519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.824 #35 NEW cov: 11768 ft: 13948 corp: 15/337b lim: 35 exec/s: 0 rss: 69Mb L: 26/30 MS: 1 EraseBytes- 00:08:15.824 [2024-11-28 07:33:26.585306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.585329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.824 [2024-11-28 07:33:26.585402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.824 [2024-11-28 07:33:26.585416] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.083 #36 NEW cov: 11768 ft: 14161 corp: 16/351b lim: 35 exec/s: 0 rss: 69Mb L: 14/30 MS: 1 EraseBytes- 00:08:16.083 [2024-11-28 07:33:26.625513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.625537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.625594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.625612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.625666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.625679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.083 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.083 #37 NEW cov: 11791 ft: 14203 corp: 17/375b lim: 35 exec/s: 0 rss: 69Mb L: 24/30 MS: 1 ChangeBinInt- 00:08:16.083 [2024-11-28 07:33:26.665646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.665671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.665726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.665739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.665792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.665805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.083 #38 NEW cov: 11791 ft: 14231 corp: 18/399b lim: 35 exec/s: 0 rss: 69Mb L: 24/30 MS: 1 ChangeBit- 00:08:16.083 [2024-11-28 07:33:26.705894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006dd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.705922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.705978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006dd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.705991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.706045] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006dd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 
07:33:26.706058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.083 [2024-11-28 07:33:26.706112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006dd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.083 [2024-11-28 07:33:26.706124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.083 #40 NEW cov: 11791 ft: 14236 corp: 19/431b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:16.084 [2024-11-28 07:33:26.735926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.735950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.736005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.736018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.736060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.736074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.736131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.736144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.084 #41 NEW cov: 11791 ft: 14323 corp: 20/461b lim: 35 exec/s: 41 rss: 69Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:08:16.084 [2024-11-28 07:33:26.775827] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.775853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.775911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.775925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.084 #42 NEW cov: 11791 ft: 14330 corp: 21/481b lim: 35 exec/s: 42 rss: 69Mb L: 20/32 MS: 1 EraseBytes- 00:08:16.084 [2024-11-28 07:33:26.816166] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.816191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.816247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.816260] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.816315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.816330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.084 [2024-11-28 07:33:26.816384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.084 [2024-11-28 07:33:26.816398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.084 #43 NEW cov: 11791 ft: 14337 corp: 22/511b lim: 35 exec/s: 43 rss: 69Mb L: 30/32 MS: 1 ChangeBinInt- 00:08:16.342 [2024-11-28 07:33:26.856333] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.856358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.856414] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.856428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.856460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.856474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.856528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.856541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.342 #44 NEW cov: 11791 ft: 14348 corp: 23/544b lim: 35 exec/s: 44 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:16.342 [2024-11-28 07:33:26.896439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.896464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.896519] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.896533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.896586] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.896602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.342 [2024-11-28 07:33:26.896658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.896671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.342 #45 NEW cov: 11791 ft: 14427 corp: 24/572b lim: 35 exec/s: 45 rss: 69Mb L: 28/33 MS: 1 ChangeBit- 00:08:16.342 [2024-11-28 07:33:26.936444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.342 [2024-11-28 07:33:26.936469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:26.936525] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:26.936539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:26.936593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:26.936615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.343 #46 NEW cov: 11791 ft: 14463 corp: 25/596b lim: 35 exec/s: 46 rss: 69Mb L: 24/33 MS: 1 ChangeBinInt- 00:08:16.343 [2024-11-28 07:33:26.976419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:26.976444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:26.976516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:26.976530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.343 #47 NEW cov: 11791 ft: 14517 corp: 26/615b lim: 35 exec/s: 47 rss: 69Mb L: 19/33 MS: 1 EraseBytes- 00:08:16.343 [2024-11-28 07:33:27.016448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000123 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.016473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.343 #48 NEW cov: 11791 ft: 14525 corp: 27/623b lim: 35 exec/s: 48 rss: 69Mb L: 8/33 MS: 1 InsertByte- 00:08:16.343 [2024-11-28 07:33:27.056799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.056824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:27.056886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.056899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.343 NEW_FUNC[1/1]: 0x481108 in feat_power_management 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:08:16.343 #49 NEW cov: 11814 ft: 14561 corp: 28/650b lim: 35 exec/s: 49 rss: 69Mb L: 27/33 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:08:16.343 [2024-11-28 07:33:27.097029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.097054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:27.097112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.097126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:27.097181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.097195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.343 [2024-11-28 07:33:27.097248] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000276 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.343 [2024-11-28 07:33:27.097260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.601 #50 NEW cov: 11814 ft: 14573 corp: 29/682b lim: 35 exec/s: 50 rss: 69Mb L: 32/33 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:08:16.601 [2024-11-28 07:33:27.137059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.137084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.137147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.137161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.137217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.137230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.601 #51 NEW cov: 11814 ft: 14608 corp: 30/706b lim: 35 exec/s: 51 rss: 69Mb L: 24/33 MS: 1 CopyPart- 00:08:16.601 [2024-11-28 07:33:27.177140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.177165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.177240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.177254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.177311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007fb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.177324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.601 #52 NEW cov: 11814 ft: 14612 corp: 31/730b lim: 35 exec/s: 52 rss: 69Mb L: 24/33 MS: 1 ChangeBinInt- 00:08:16.601 [2024-11-28 07:33:27.217231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.217256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.217312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.217326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.217381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.217394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.601 #53 NEW cov: 11814 ft: 14706 corp: 32/754b lim: 35 exec/s: 53 rss: 69Mb L: 24/33 MS: 1 CMP- DE: "\000\222\373\361\314\272\002\""- 00:08:16.601 [2024-11-28 07:33:27.257376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.257400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.257458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.257472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.601 NEW_FUNC[1/1]: 0x47fd88 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:16.601 #54 NEW cov: 11852 ft: 14767 corp: 33/777b lim: 35 exec/s: 54 rss: 69Mb L: 23/33 MS: 1 PersAutoDict- DE: "\001\000\002\000"- 00:08:16.601 [2024-11-28 07:33:27.297237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000523 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.297264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 #55 NEW cov: 11852 ft: 14789 corp: 34/785b lim: 35 exec/s: 55 rss: 69Mb L: 8/33 MS: 1 InsertByte- 00:08:16.601 [2024-11-28 07:33:27.337566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.337591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.601 [2024-11-28 07:33:27.337656] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.601 [2024-11-28 07:33:27.337670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.602 [2024-11-28 07:33:27.337726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.602 [2024-11-28 07:33:27.337740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.602 #56 NEW cov: 11852 ft: 14804 corp: 35/812b lim: 35 exec/s: 56 rss: 69Mb L: 27/33 MS: 1 CopyPart- 00:08:16.860 [2024-11-28 07:33:27.377735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.377760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.860 [2024-11-28 07:33:27.377819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.377833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.860 [2024-11-28 07:33:27.377890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.377904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.860 #57 NEW cov: 11852 ft: 14821 corp: 36/836b lim: 35 exec/s: 57 rss: 69Mb L: 24/33 MS: 1 ChangeByte- 00:08:16.860 [2024-11-28 07:33:27.417927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.417952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.860 [2024-11-28 07:33:27.418007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.418021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.860 [2024-11-28 07:33:27.418077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.418090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.860 [2024-11-28 07:33:27.418146] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.418159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.860 #58 NEW cov: 11852 ft: 14827 corp: 37/864b lim: 35 exec/s: 58 rss: 70Mb L: 28/33 MS: 1 InsertByte- 00:08:16.860 [2024-11-28 07:33:27.458090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002c8 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.860 [2024-11-28 07:33:27.458114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.458174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.458188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.458260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.458273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.458328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.458341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.861 #59 NEW cov: 11852 ft: 14837 corp: 38/892b lim: 35 exec/s: 59 rss: 70Mb L: 28/33 MS: 1 CopyPart- 00:08:16.861 [2024-11-28 07:33:27.498191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.498215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.498325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.498340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.498394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.498407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.861 #60 NEW cov: 11852 ft: 14862 corp: 39/920b lim: 35 exec/s: 60 rss: 70Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:08:16.861 [2024-11-28 07:33:27.538131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.538156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.538212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.538226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.538283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.538295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:16.861 #61 NEW cov: 11852 ft: 14869 corp: 40/944b lim: 35 exec/s: 61 rss: 70Mb L: 24/33 MS: 1 ChangeBinInt- 00:08:16.861 [2024-11-28 07:33:27.578120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.578145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 NEW_FUNC[1/1]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:16.861 #63 NEW cov: 11866 ft: 14905 corp: 41/963b lim: 35 exec/s: 63 rss: 70Mb L: 19/33 MS: 2 CopyPart-CrossOver- 00:08:16.861 [2024-11-28 07:33:27.618462] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000230 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.618486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.618547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.618561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.618618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.618631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.861 [2024-11-28 07:33:27.618686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-11-28 07:33:27.618698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.120 #64 NEW cov: 11866 ft: 14910 corp: 42/996b lim: 35 exec/s: 64 rss: 70Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:17.120 [2024-11-28 07:33:27.658693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.658717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.658774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.658787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.658840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.658853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.658909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.658922] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.658975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000260 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.658988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.120 #65 NEW cov: 11866 ft: 14956 corp: 43/1031b lim: 35 exec/s: 65 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:17.120 [2024-11-28 07:33:27.698707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000230 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.698732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.698789] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.698803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.698858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.698871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.698926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.698938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.120 #66 NEW cov: 11866 ft: 14965 corp: 44/1064b lim: 35 exec/s: 66 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:08:17.120 [2024-11-28 07:33:27.738717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.738742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.738799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.738812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-11-28 07:33:27.738869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-11-28 07:33:27.738881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.120 #67 NEW cov: 11866 ft: 14983 corp: 45/1088b lim: 35 exec/s: 33 rss: 70Mb L: 24/35 MS: 1 CopyPart- 00:08:17.120 #67 DONE cov: 11866 ft: 14983 corp: 45/1088b lim: 35 exec/s: 33 rss: 70Mb 00:08:17.120 ###### Recommended dictionary. ###### 00:08:17.120 "\001\000\002\000" # Uses: 3 00:08:17.120 "\000\222\373\361\314\272\002\"" # Uses: 0 00:08:17.120 ###### End of recommended dictionary. 
###### 00:08:17.120 Done 67 runs in 2 second(s) 00:08:17.120 07:33:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:17.120 07:33:27 -- ../common.sh@72 -- # (( i++ )) 00:08:17.120 07:33:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.120 07:33:27 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:17.120 07:33:27 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:17.120 07:33:27 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.120 07:33:27 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.120 07:33:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:17.120 07:33:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:17.120 07:33:27 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:17.120 07:33:27 -- nvmf/run.sh@29 -- # port=4416 00:08:17.120 07:33:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:17.120 07:33:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:17.120 07:33:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.379 07:33:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:17.379 [2024-11-28 07:33:27.920364] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:17.379 [2024-11-28 07:33:27.920451] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661542 ] 00:08:17.379 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.379 [2024-11-28 07:33:28.097175] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.379 [2024-11-28 07:33:28.116558] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.379 [2024-11-28 07:33:28.116701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.637 [2024-11-28 07:33:28.167954] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.637 [2024-11-28 07:33:28.184312] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:17.637 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.637 INFO: Seed: 708864629 00:08:17.637 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:17.637 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:17.637 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:17.637 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.637 #2 INITED exec/s: 0 rss: 59Mb 00:08:17.637 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:17.637 This may also happen if the target rejected all inputs we tried so far 00:08:17.637 [2024-11-28 07:33:28.260277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-28 07:33:28.260320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.637 [2024-11-28 07:33:28.260450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-28 07:33:28.260474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.896 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:17.896 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.896 #5 NEW cov: 11667 ft: 11664 corp: 2/63b lim: 105 exec/s: 0 rss: 67Mb L: 62/62 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:17.896 [2024-11-28 07:33:28.580808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.896 [2024-11-28 07:33:28.580863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.896 #6 NEW cov: 11780 ft: 12722 corp: 3/103b lim: 105 exec/s: 0 rss: 67Mb L: 40/62 MS: 1 EraseBytes- 00:08:17.896 [2024-11-28 07:33:28.631446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.896 [2024-11-28 07:33:28.631481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.896 [2024-11-28 07:33:28.631569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.896 [2024-11-28 07:33:28.631588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.896 [2024-11-28 07:33:28.631708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.896 [2024-11-28 07:33:28.631729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.896 [2024-11-28 07:33:28.631844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.896 [2024-11-28 07:33:28.631870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.896 #18 NEW cov: 11786 ft: 13573 corp: 4/199b lim: 105 exec/s: 0 rss: 67Mb L: 96/96 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:18.155 [2024-11-28 07:33:28.671586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.671620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.671723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.671744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.671861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.671884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.671994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.672017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.155 #24 NEW cov: 11871 ft: 13817 corp: 5/295b lim: 105 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 ChangeByte- 00:08:18.155 [2024-11-28 07:33:28.721169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.721198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 #25 NEW cov: 11871 ft: 13945 corp: 6/335b lim: 105 exec/s: 0 rss: 67Mb L: 40/96 MS: 1 ChangeBit- 00:08:18.155 [2024-11-28 07:33:28.761908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.761941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.762041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.762064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.762181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.762205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.762319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.762341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.155 #26 NEW cov: 11871 ft: 14060 corp: 7/423b lim: 105 exec/s: 0 rss: 
67Mb L: 88/96 MS: 1 EraseBytes- 00:08:18.155 [2024-11-28 07:33:28.801987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.802018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.802094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.802115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.802231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.802255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.802371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.802395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.155 #27 NEW cov: 11871 ft: 14106 corp: 8/519b lim: 105 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 ChangeBinInt- 00:08:18.155 [2024-11-28 07:33:28.851804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:184483840 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.851848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.851969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.851987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.155 #28 NEW cov: 11871 ft: 14136 corp: 9/581b lim: 105 exec/s: 0 rss: 67Mb L: 62/96 MS: 1 ChangeBinInt- 00:08:18.155 [2024-11-28 07:33:28.892296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.892327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.892412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.892430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.892545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.892567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.155 [2024-11-28 07:33:28.892693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.155 [2024-11-28 07:33:28.892714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.155 #29 NEW cov: 11871 ft: 14181 corp: 10/677b lim: 105 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 ChangeByte- 00:08:18.414 [2024-11-28 07:33:28.942025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:184483840 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.942056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:28.942173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.942195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.414 #30 NEW cov: 11871 ft: 14230 corp: 11/739b lim: 105 exec/s: 0 rss: 67Mb L: 62/96 MS: 1 ChangeByte- 00:08:18.414 [2024-11-28 07:33:28.982751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.982785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:28.982869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.982895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:28.983010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.983036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:28.983154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.983175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:28.983298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65532 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:28.983319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:18.414 #31 NEW cov: 11871 ft: 14278 corp: 12/844b lim: 105 exec/s: 0 rss: 68Mb L: 105/105 MS: 1 CrossOver- 00:08:18.414 [2024-11-28 07:33:29.032465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 
len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.032498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.032595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.032624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.032741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.032762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.414 #32 NEW cov: 11871 ft: 14550 corp: 13/913b lim: 105 exec/s: 0 rss: 68Mb L: 69/105 MS: 1 EraseBytes- 00:08:18.414 [2024-11-28 07:33:29.072657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.072691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.072821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.072845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.072958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.072979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.414 #33 NEW cov: 11871 ft: 14590 corp: 14/982b lim: 105 exec/s: 0 rss: 68Mb L: 69/105 MS: 1 CopyPart- 00:08:18.414 [2024-11-28 07:33:29.122353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18428729675200069631 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.122386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.414 #34 NEW cov: 11894 ft: 14648 corp: 15/1022b lim: 105 exec/s: 0 rss: 68Mb L: 40/105 MS: 1 CopyPart- 00:08:18.414 [2024-11-28 07:33:29.173102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.173138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.173240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.173262] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.173375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551399 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.173396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.414 [2024-11-28 07:33:29.173504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.414 [2024-11-28 07:33:29.173526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.673 #35 NEW cov: 11894 ft: 14671 corp: 16/1118b lim: 105 exec/s: 0 rss: 68Mb L: 96/105 MS: 1 ChangeByte- 00:08:18.673 [2024-11-28 07:33:29.213027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.213059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.213153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.213173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.213291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.213315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 #36 NEW cov: 11894 ft: 14686 corp: 17/1182b lim: 105 exec/s: 36 rss: 68Mb L: 64/105 MS: 1 EraseBytes- 00:08:18.673 [2024-11-28 07:33:29.253220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.253250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.253374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.253400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.253529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.253553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 #37 NEW cov: 11894 ft: 14731 corp: 18/1246b lim: 105 exec/s: 37 rss: 68Mb L: 64/105 MS: 1 ChangeBinInt- 00:08:18.673 [2024-11-28 07:33:29.293363] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.293389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.293522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.293547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.293668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.293692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 #38 NEW cov: 11894 ft: 14760 corp: 19/1310b lim: 105 exec/s: 38 rss: 68Mb L: 64/105 MS: 1 ChangeBit- 00:08:18.673 [2024-11-28 07:33:29.333371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.333402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.333493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.333516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.333637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:71468255805184 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.333658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 #39 NEW cov: 11894 ft: 14773 corp: 20/1374b lim: 105 exec/s: 39 rss: 68Mb L: 64/105 MS: 1 ChangeBinInt- 00:08:18.673 [2024-11-28 07:33:29.383670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.383701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.383784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.383824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.383940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.383966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:18.673 [2024-11-28 07:33:29.384082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2161727821137838079 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.384104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.673 #40 NEW cov: 11894 ft: 14775 corp: 21/1470b lim: 105 exec/s: 40 rss: 68Mb L: 96/105 MS: 1 ChangeByte- 00:08:18.673 [2024-11-28 07:33:29.423688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.423725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.423814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.423841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-11-28 07:33:29.423963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-11-28 07:33:29.423993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 #41 NEW cov: 11894 ft: 14794 corp: 22/1535b lim: 105 exec/s: 41 rss: 68Mb L: 65/105 MS: 1 InsertByte- 00:08:18.932 [2024-11-28 07:33:29.463781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.463813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.463885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.463907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.464032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.464054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 #42 NEW cov: 11894 ft: 14809 corp: 23/1604b lim: 105 exec/s: 42 rss: 68Mb L: 69/105 MS: 1 ShuffleBytes- 00:08:18.932 [2024-11-28 07:33:29.513949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.513985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.514103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 
[2024-11-28 07:33:29.514125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.514249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:71468255805184 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.514274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 #43 NEW cov: 11894 ft: 14826 corp: 24/1668b lim: 105 exec/s: 43 rss: 68Mb L: 64/105 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:18.932 [2024-11-28 07:33:29.564059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.564094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.564200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.564222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.564356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073693495295 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.564381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 #44 NEW cov: 11894 ft: 14840 corp: 25/1732b lim: 105 exec/s: 44 rss: 69Mb L: 64/105 MS: 1 CrossOver- 00:08:18.932 [2024-11-28 07:33:29.614069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16714240 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.614100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.614232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.614255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 #45 NEW cov: 11894 ft: 14920 corp: 26/1794b lim: 105 exec/s: 45 rss: 69Mb L: 62/105 MS: 1 ShuffleBytes- 00:08:18.932 [2024-11-28 07:33:29.654087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.654118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.654247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.654269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 #46 NEW cov: 
11894 ft: 14929 corp: 27/1846b lim: 105 exec/s: 46 rss: 69Mb L: 52/105 MS: 1 EraseBytes- 00:08:18.932 [2024-11-28 07:33:29.694575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.694610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.694710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.694732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.694849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.694871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 [2024-11-28 07:33:29.694989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069414584575 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-11-28 07:33:29.695012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.192 #47 NEW cov: 11894 ft: 14950 corp: 28/1942b lim: 105 exec/s: 47 rss: 69Mb L: 96/105 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:19.192 [2024-11-28 07:33:29.734468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.734500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.734624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.734646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.192 #48 NEW cov: 11894 ft: 14972 corp: 29/1999b lim: 105 exec/s: 48 rss: 69Mb L: 57/105 MS: 1 EraseBytes- 00:08:19.192 [2024-11-28 07:33:29.774514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.774540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.774653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.774681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.192 #49 NEW cov: 11894 ft: 14978 corp: 30/2061b lim: 105 exec/s: 49 rss: 69Mb L: 62/105 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:19.192 [2024-11-28 07:33:29.814759] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.814790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.814851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.814870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.814985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.815008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.192 #50 NEW cov: 11894 ft: 14985 corp: 31/2138b lim: 105 exec/s: 50 rss: 69Mb L: 77/105 MS: 1 CopyPart- 00:08:19.192 [2024-11-28 07:33:29.854932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.854965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.855064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.855086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.855154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.855177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.192 #51 NEW cov: 11894 ft: 14986 corp: 32/2202b lim: 105 exec/s: 51 rss: 69Mb L: 64/105 MS: 1 ChangeBinInt- 00:08:19.192 [2024-11-28 07:33:29.894662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18428729675200069631 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.894692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 #57 NEW cov: 11894 ft: 14991 corp: 33/2242b lim: 105 exec/s: 57 rss: 69Mb L: 40/105 MS: 1 CopyPart- 00:08:19.192 [2024-11-28 07:33:29.935398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.935428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.935551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.935572] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.935686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.935711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.192 [2024-11-28 07:33:29.935846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.192 [2024-11-28 07:33:29.935866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.192 #63 NEW cov: 11894 ft: 14999 corp: 34/2341b lim: 105 exec/s: 63 rss: 69Mb L: 99/105 MS: 1 CopyPart- 00:08:19.452 [2024-11-28 07:33:29.975093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:29.975119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:29.975253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:29.975276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.452 #64 NEW cov: 11894 ft: 15006 corp: 35/2403b lim: 105 exec/s: 64 rss: 69Mb L: 62/105 MS: 1 ChangeBinInt- 00:08:19.452 [2024-11-28 07:33:30.025424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.025455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.025542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4294902016 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.025563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.025680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744069414584320 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.025703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.452 #65 NEW cov: 11894 ft: 15025 corp: 36/2475b lim: 105 exec/s: 65 rss: 69Mb L: 72/105 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:19.452 [2024-11-28 07:33:30.075318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18428729675200069631 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.075352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 #66 NEW cov: 11894 ft: 15045 corp: 37/2516b 
lim: 105 exec/s: 66 rss: 69Mb L: 41/105 MS: 1 InsertByte- 00:08:19.452 [2024-11-28 07:33:30.125833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070085672959 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.125864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.125955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446463698261245951 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.125974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.126091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4294967295 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.126113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.452 #67 NEW cov: 11894 ft: 15050 corp: 38/2588b lim: 105 exec/s: 67 rss: 69Mb L: 72/105 MS: 1 CopyPart- 00:08:19.452 [2024-11-28 07:33:30.165747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.165782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.165895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.165917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.452 #68 NEW cov: 11894 ft: 15112 corp: 39/2640b lim: 105 exec/s: 68 rss: 69Mb L: 52/105 MS: 1 EraseBytes- 00:08:19.452 [2024-11-28 07:33:30.206038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070080757759 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.206071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.206187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.206207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.452 [2024-11-28 07:33:30.206323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073693495295 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.452 [2024-11-28 07:33:30.206359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.711 #69 NEW cov: 11894 ft: 15122 corp: 40/2704b lim: 105 exec/s: 69 rss: 69Mb L: 64/105 MS: 1 ShuffleBytes- 00:08:19.711 [2024-11-28 07:33:30.245959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.711 [2024-11-28 07:33:30.245990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.711 [2024-11-28 07:33:30.246111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.711 [2024-11-28 07:33:30.246137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.711 #70 NEW cov: 11894 ft: 15150 corp: 41/2766b lim: 105 exec/s: 35 rss: 69Mb L: 62/105 MS: 1 ChangeByte- 00:08:19.711 #70 DONE cov: 11894 ft: 15150 corp: 41/2766b lim: 105 exec/s: 35 rss: 69Mb 00:08:19.711 ###### Recommended dictionary. ###### 00:08:19.711 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:19.711 "\000\000\000\000" # Uses: 0 00:08:19.711 ###### End of recommended dictionary. ###### 00:08:19.711 Done 70 runs in 2 second(s) 00:08:19.711 07:33:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:19.711 07:33:30 -- ../common.sh@72 -- # (( i++ )) 00:08:19.711 07:33:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.711 07:33:30 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:19.711 07:33:30 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:19.711 07:33:30 -- nvmf/run.sh@24 -- # local timen=1 00:08:19.711 07:33:30 -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.711 07:33:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:19.711 07:33:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:19.711 07:33:30 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:19.711 07:33:30 -- nvmf/run.sh@29 -- # port=4417 00:08:19.711 07:33:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:19.711 07:33:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:19.711 07:33:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.711 07:33:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:19.711 [2024-11-28 07:33:30.431962] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:19.711 [2024-11-28 07:33:30.432055] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661591 ] 00:08:19.711 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.970 [2024-11-28 07:33:30.614608] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.970 [2024-11-28 07:33:30.634837] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:19.970 [2024-11-28 07:33:30.634956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.970 [2024-11-28 07:33:30.686467] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.970 [2024-11-28 07:33:30.702784] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:19.970 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.970 INFO: Seed: 3228878377 00:08:19.970 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:19.970 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:19.970 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:19.970 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.970 #2 INITED exec/s: 0 rss: 60Mb 00:08:19.970 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:19.970 This may also happen if the target rejected all inputs we tried so far 00:08:20.228 [2024-11-28 07:33:30.747579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069431492607 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.228 [2024-11-28 07:33:30.747623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.228 [2024-11-28 07:33:30.747658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.228 [2024-11-28 07:33:30.747676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.228 [2024-11-28 07:33:30.747707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.228 [2024-11-28 07:33:30.747723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.228 [2024-11-28 07:33:30.747751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.228 [2024-11-28 07:33:30.747766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.488 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:20.488 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:20.488 #7 NEW cov: 11688 ft: 11689 corp: 2/102b lim: 120 exec/s: 0 rss: 67Mb L: 
101/101 MS: 5 ChangeBit-ChangeBinInt-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:20.488 [2024-11-28 07:33:31.068092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.068132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.488 #14 NEW cov: 11801 ft: 12909 corp: 3/142b lim: 120 exec/s: 0 rss: 67Mb L: 40/101 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:20.488 [2024-11-28 07:33:31.128321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.128352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.128400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:34048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.128417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.128446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.128462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.128490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.128506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.488 #15 NEW cov: 11807 ft: 13285 corp: 4/245b lim: 120 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 CrossOver- 00:08:20.488 [2024-11-28 07:33:31.188294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.188323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.488 #21 NEW cov: 11892 ft: 13590 corp: 5/286b lim: 120 exec/s: 0 rss: 67Mb L: 41/103 MS: 1 InsertByte- 00:08:20.488 [2024-11-28 07:33:31.248660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.248689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.248736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:34048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.248754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.248783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.248800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.488 [2024-11-28 07:33:31.248827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9549038584909004799 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.488 [2024-11-28 07:33:31.248843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.747 #27 NEW cov: 11892 ft: 13656 corp: 6/389b lim: 120 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 CopyPart- 00:08:20.747 [2024-11-28 07:33:31.308604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.308633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.747 #30 NEW cov: 11892 ft: 13843 corp: 7/421b lim: 120 exec/s: 0 rss: 67Mb L: 32/103 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:20.747 [2024-11-28 07:33:31.358913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.358943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.358973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.358990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.359018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.359033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.359059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.359074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.747 #31 NEW cov: 11892 ft: 13911 corp: 8/527b lim: 120 exec/s: 0 rss: 67Mb L: 106/106 MS: 1 InsertRepeatedBytes- 00:08:20.747 [2024-11-28 07:33:31.409042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.409072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.409119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814627169412 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.409136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.409165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.409181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.409209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.409225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.747 #32 NEW cov: 11892 ft: 14005 corp: 9/633b lim: 120 exec/s: 0 rss: 68Mb L: 106/106 MS: 1 ChangeBit- 00:08:20.747 [2024-11-28 07:33:31.469095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.469124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.747 [2024-11-28 07:33:31.469172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902812402843780 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.747 [2024-11-28 07:33:31.469190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.006 #33 NEW cov: 11892 ft: 14415 corp: 10/694b lim: 120 exec/s: 0 rss: 68Mb L: 61/106 MS: 1 CrossOver- 00:08:21.006 [2024-11-28 07:33:31.539219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.539248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.006 #34 NEW cov: 11892 ft: 14530 corp: 11/726b lim: 120 exec/s: 0 rss: 68Mb L: 32/106 MS: 1 ChangeBit- 00:08:21.006 [2024-11-28 07:33:31.599577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.599613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.006 [2024-11-28 07:33:31.599662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814627169412 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.599680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.006 [2024-11-28 07:33:31.599709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.599725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.006 [2024-11-28 07:33:31.599752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.599768] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.006 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.006 #35 NEW cov: 11909 ft: 14567 corp: 12/832b lim: 120 exec/s: 0 rss: 68Mb L: 106/106 MS: 1 CMP- DE: "\000\002\000\000"- 00:08:21.006 [2024-11-28 07:33:31.659581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.659619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.006 #36 NEW cov: 11909 ft: 14601 corp: 13/873b lim: 120 exec/s: 0 rss: 68Mb L: 41/106 MS: 1 ChangeByte- 00:08:21.006 [2024-11-28 07:33:31.729748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.006 [2024-11-28 07:33:31.729779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.006 #42 NEW cov: 11909 ft: 14621 corp: 14/914b lim: 120 exec/s: 42 rss: 68Mb L: 41/106 MS: 1 InsertByte- 00:08:21.265 [2024-11-28 07:33:31.790403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.790430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 #43 NEW cov: 11909 ft: 14807 corp: 15/955b lim: 120 exec/s: 43 rss: 68Mb L: 41/106 MS: 1 ShuffleBytes- 00:08:21.265 [2024-11-28 07:33:31.830525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.830552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 #44 NEW cov: 11909 ft: 14828 corp: 16/996b lim: 120 exec/s: 44 rss: 68Mb L: 41/106 MS: 1 ChangeBinInt- 00:08:21.265 [2024-11-28 07:33:31.870658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:541703557838308484 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.870685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 #45 NEW cov: 11909 ft: 14881 corp: 17/1038b lim: 120 exec/s: 45 rss: 68Mb L: 42/106 MS: 1 CrossOver- 00:08:21.265 [2024-11-28 07:33:31.910912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.910942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 [2024-11-28 07:33:31.910995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902812402843780 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.911010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
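The NEW_FUNC markers in this run name the harness entry points: libFuzzer hands each generated input to TestOneInput (llvm_nvme_fuzz.c:780), which dispatches to a per-opcode fuzzer such as fuzz_nvm_write_command (llvm_nvme_fuzz.c:540), and the paired nvme_qpair.c NOTICE lines are the resulting command submission and its completion. A minimal sketch of that shape, with a hypothetical field layout and a stand-in submit_write() helper rather than the actual SPDK code:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Stand-in for the SPDK qpair submission path that produces the
     * "WRITE sqid:... lba:... len:..." NOTICE lines in this log. */
    static int submit_write(uint64_t lba, uint32_t nlb,
                            const uint8_t *payload, size_t payload_len)
    {
        (void)lba; (void)nlb; (void)payload; (void)payload_len;
        return 0;
    }

    /* Interpret the head of the fuzz input as write-command fields;
     * this layout is illustrative, not SPDK's actual one. */
    static int fuzz_nvm_write_command(const uint8_t *data, size_t size)
    {
        uint64_t lba;
        uint32_t nlb;

        if (size < sizeof(lba) + sizeof(nlb)) {
            return -1; /* too short to form a command */
        }
        memcpy(&lba, data, sizeof(lba));
        memcpy(&nlb, data + sizeof(lba), sizeof(nlb));
        return submit_write(lba, nlb,
                            data + sizeof(lba) + sizeof(nlb),
                            size - (sizeof(lba) + sizeof(nlb)));
    }

    /* libFuzzer calls this once per generated input; the SPDK harness
     * registers its own TestOneInput with the fuzzer driver instead. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        fuzz_nvm_write_command(data, size);
        return 0;
    }

In the "#N NEW" status lines printed after such calls, cov counts coverage points, ft counts features, corp is the corpus size in inputs/bytes, L is the length of this input against the current limit, and MS names the mutation sequence (InsertByte, CrossOver, PersAutoDict, and so on) that produced it.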
00:08:21.265 #46 NEW cov: 11909 ft: 14901 corp: 18/1099b lim: 120 exec/s: 46 rss: 68Mb L: 61/106 MS: 1 CopyPart- 00:08:21.265 [2024-11-28 07:33:31.950913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902812478637063 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.950939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 #47 NEW cov: 11909 ft: 14915 corp: 19/1140b lim: 120 exec/s: 47 rss: 68Mb L: 41/106 MS: 1 ChangeBit- 00:08:21.265 [2024-11-28 07:33:31.990990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:31.991017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 #48 NEW cov: 11909 ft: 14944 corp: 20/1181b lim: 120 exec/s: 48 rss: 68Mb L: 41/106 MS: 1 ChangeBinInt- 00:08:21.265 [2024-11-28 07:33:32.031575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:32.031606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.265 [2024-11-28 07:33:32.031655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:34048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:32.031670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.265 [2024-11-28 07:33:32.031718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:32.031732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.265 [2024-11-28 07:33:32.031758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9549038584909004799 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.265 [2024-11-28 07:33:32.031773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.525 #49 NEW cov: 11909 ft: 14976 corp: 21/1284b lim: 120 exec/s: 49 rss: 68Mb L: 103/106 MS: 1 ChangeBinInt- 00:08:21.525 [2024-11-28 07:33:32.071679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.525 [2024-11-28 07:33:32.071707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.525 [2024-11-28 07:33:32.071744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.071759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.071806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.071821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.071866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744070543066111 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.071881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.526 #50 NEW cov: 11909 ft: 14990 corp: 22/1400b lim: 120 exec/s: 50 rss: 68Mb L: 116/116 MS: 1 InsertRepeatedBytes- 00:08:21.526 [2024-11-28 07:33:32.111905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.111932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.111979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.111994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.112040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.112055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.112106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4846792386510061635 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.112120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.112168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.112183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.526 #51 NEW cov: 11909 ft: 15076 corp: 23/1520b lim: 120 exec/s: 51 rss: 68Mb L: 120/120 MS: 1 PersAutoDict- DE: "\000\002\000\000"- 00:08:21.526 [2024-11-28 07:33:32.151908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.151934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.151997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.152013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.152061] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12370262272184942592 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.152076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.152124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.152137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.526 #52 NEW cov: 11909 ft: 15080 corp: 24/1626b lim: 120 exec/s: 52 rss: 68Mb L: 106/120 MS: 1 CrossOver- 00:08:21.526 [2024-11-28 07:33:32.191715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:232 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.191741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.191784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902812402843648 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.191799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 #58 NEW cov: 11909 ft: 15097 corp: 25/1688b lim: 120 exec/s: 58 rss: 68Mb L: 62/120 MS: 1 InsertByte- 00:08:21.526 [2024-11-28 07:33:32.232225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.232251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.232303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.232317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.232364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.232378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.232426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4846792386510061635 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.232440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.526 [2024-11-28 07:33:32.232480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.232496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.526 #59 NEW cov: 
11909 ft: 15107 corp: 26/1808b lim: 120 exec/s: 59 rss: 68Mb L: 120/120 MS: 1 ChangeBit- 00:08:21.526 [2024-11-28 07:33:32.271836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.526 [2024-11-28 07:33:32.271862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.526 #60 NEW cov: 11909 ft: 15111 corp: 27/1854b lim: 120 exec/s: 60 rss: 68Mb L: 46/120 MS: 1 CrossOver- 00:08:21.814 [2024-11-28 07:33:32.312487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548891776560170116 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.312514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.814 [2024-11-28 07:33:32.312585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8825501129363457156 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.312603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.814 [2024-11-28 07:33:32.312652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.312669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.814 [2024-11-28 07:33:32.312718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.312736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.814 [2024-11-28 07:33:32.312783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:4899916390288540483 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.312798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.814 #61 NEW cov: 11909 ft: 15180 corp: 28/1974b lim: 120 exec/s: 61 rss: 68Mb L: 120/120 MS: 1 CopyPart- 00:08:21.814 [2024-11-28 07:33:32.352081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:150995456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.352107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.814 #66 NEW cov: 11909 ft: 15258 corp: 29/2007b lim: 120 exec/s: 66 rss: 68Mb L: 33/120 MS: 5 ChangeBinInt-ChangeBit-InsertByte-PersAutoDict-CrossOver- DE: "\000\002\000\000"- 00:08:21.814 [2024-11-28 07:33:32.392144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902812478637063 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.392171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.814 #67 NEW cov: 11909 ft: 15284 corp: 30/2048b lim: 120 exec/s: 67 rss: 
68Mb L: 41/120 MS: 1 ShuffleBytes- 00:08:21.814 [2024-11-28 07:33:32.432317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.814 [2024-11-28 07:33:32.432343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.815 #68 NEW cov: 11909 ft: 15305 corp: 31/2089b lim: 120 exec/s: 68 rss: 68Mb L: 41/120 MS: 1 ChangeByte- 00:08:21.815 [2024-11-28 07:33:32.472386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120711 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.472412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.815 #69 NEW cov: 11909 ft: 15311 corp: 32/2130b lim: 120 exec/s: 69 rss: 68Mb L: 41/120 MS: 1 ShuffleBytes- 00:08:21.815 [2024-11-28 07:33:32.512489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.512515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.815 #70 NEW cov: 11909 ft: 15325 corp: 33/2162b lim: 120 exec/s: 70 rss: 68Mb L: 32/120 MS: 1 CrossOver- 00:08:21.815 [2024-11-28 07:33:32.553088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.553115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.815 [2024-11-28 07:33:32.553162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.553177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.815 [2024-11-28 07:33:32.553225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.553239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.815 [2024-11-28 07:33:32.553288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.815 [2024-11-28 07:33:32.553303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.815 #71 NEW cov: 11909 ft: 15364 corp: 34/2274b lim: 120 exec/s: 71 rss: 68Mb L: 112/120 MS: 1 InsertRepeatedBytes- 00:08:22.114 [2024-11-28 07:33:32.593166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.593194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.114 [2024-11-28 07:33:32.593241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 
len:34048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.593257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.114 [2024-11-28 07:33:32.593304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.593319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.114 [2024-11-28 07:33:32.593366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.593382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.114 #72 NEW cov: 11909 ft: 15378 corp: 35/2377b lim: 120 exec/s: 72 rss: 68Mb L: 103/120 MS: 1 ChangeBit- 00:08:22.114 [2024-11-28 07:33:32.633319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069431492607 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.633345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.114 [2024-11-28 07:33:32.633377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.114 [2024-11-28 07:33:32.633392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 [2024-11-28 07:33:32.633441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.115 [2024-11-28 07:33:32.633456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.115 [2024-11-28 07:33:32.633506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.115 [2024-11-28 07:33:32.633520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.115 #73 NEW cov: 11916 ft: 15434 corp: 36/2478b lim: 120 exec/s: 73 rss: 68Mb L: 101/120 MS: 1 ChangeBinInt- 00:08:22.115 [2024-11-28 07:33:32.673397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:145704693792770 len:31355 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.115 [2024-11-28 07:33:32.673423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.115 [2024-11-28 07:33:32.673470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9548902814626120836 len:33925 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.115 [2024-11-28 07:33:32.673485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.115 [2024-11-28 07:33:32.673532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:22.115 [2024-11-28 07:33:32.673550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:22.115 [2024-11-28 07:33:32.673603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:22.115 [2024-11-28 07:33:32.673616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:22.115 #74 NEW cov: 11916 ft: 15491 corp: 37/2588b lim: 120 exec/s: 74 rss: 69Mb L: 110/120 MS: 1 PersAutoDict- DE: "\000\002\000\000"-
00:08:22.115 [2024-11-28 07:33:32.713088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:22.115 [2024-11-28 07:33:32.713114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.115 #75 NEW cov: 11916 ft: 15503 corp: 38/2624b lim: 120 exec/s: 37 rss: 69Mb L: 36/120 MS: 1 PersAutoDict- DE: "\000\002\000\000"-
00:08:22.115 #75 DONE cov: 11916 ft: 15503 corp: 38/2624b lim: 120 exec/s: 37 rss: 69Mb
00:08:22.115 ###### Recommended dictionary. ######
00:08:22.115 "\000\002\000\000" # Uses: 4
00:08:22.115 ###### End of recommended dictionary. ######
00:08:22.115 Done 75 runs in 2 second(s)
00:08:22.115 07:33:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf
00:08:22.115 07:33:32 -- ../common.sh@72 -- # (( i++ ))
00:08:22.115 07:33:32 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:22.115 07:33:32 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
00:08:22.115 07:33:32 -- nvmf/run.sh@23 -- # local fuzzer_type=18
00:08:22.115 07:33:32 -- nvmf/run.sh@24 -- # local timen=1
00:08:22.115 07:33:32 -- nvmf/run.sh@25 -- # local core=0x1
00:08:22.115 07:33:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:22.115 07:33:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
00:08:22.115 07:33:32 -- nvmf/run.sh@29 -- # printf %02d 18
00:08:22.115 07:33:32 -- nvmf/run.sh@29 -- # port=4418
00:08:22.115 07:33:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:22.115 07:33:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
00:08:22.115 07:33:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:22.115 07:33:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock
00:08:22.115 [2024-11-28 07:33:32.878379] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
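The run-17 summary just above is worth unpacking: after 75 executions the corpus held 38 inputs (2624 bytes), and libFuzzer's recommended-dictionary block reports that the 4-byte token "\000\002\000\000" paid off 4 times via the PersAutoDict mutation. A token like that can be carried into later runs as a dictionary entry; a hypothetical dictionary file (not produced by this job), assuming libFuzzer's usual name="value" entry syntax with \xNN byte escapes, might read:

    # nvmf_17.dict - hypothetical file; the token below is the one
    # recommended at the end of run 17 ("\000\002\000\000" in octal).
    cdw_token="\x00\x02\x00\x00"

Loaded with libFuzzer's -dict= option, the entry would be spliced into inputs directly instead of being rediscovered at mutation time. The shell trace above also shows how run.sh stages fuzzer 18: the listen port is 44 followed by the zero-padded fuzzer number (printf %02d 18, hence port=4418), sed rewrites the trsvcid in fuzz_json.conf to match, and llvm_nvme_fuzz is launched against the corresponding NVMe/TCP trid.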
00:08:22.115 [2024-11-28 07:33:32.878437] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661643 ] 00:08:22.372 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.372 [2024-11-28 07:33:33.130175] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.630 [2024-11-28 07:33:33.159312] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.630 [2024-11-28 07:33:33.159434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.630 [2024-11-28 07:33:33.210921] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:22.630 [2024-11-28 07:33:33.227288] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:22.630 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.630 INFO: Seed: 1456893971 00:08:22.630 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:22.630 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:22.630 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:22.630 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.630 #2 INITED exec/s: 0 rss: 59Mb 00:08:22.630 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:22.630 This may also happen if the target rejected all inputs we tried so far 00:08:22.630 [2024-11-28 07:33:33.272503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.630 [2024-11-28 07:33:33.272531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.630 [2024-11-28 07:33:33.272565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.630 [2024-11-28 07:33:33.272578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.630 [2024-11-28 07:33:33.272641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.630 [2024-11-28 07:33:33.272656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.888 NEW_FUNC[1/670]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:22.888 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.888 #13 NEW cov: 11632 ft: 11600 corp: 2/61b lim: 100 exec/s: 0 rss: 67Mb L: 60/60 MS: 1 InsertRepeatedBytes- 00:08:22.888 [2024-11-28 07:33:33.583383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.888 [2024-11-28 07:33:33.583415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.583459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.888 [2024-11-28 07:33:33.583474] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.583523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.888 [2024-11-28 07:33:33.583537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.583589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.888 [2024-11-28 07:33:33.583609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.888 #14 NEW cov: 11745 ft: 12257 corp: 3/151b lim: 100 exec/s: 0 rss: 67Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:22.888 [2024-11-28 07:33:33.623396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:22.888 [2024-11-28 07:33:33.623421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.623465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:22.888 [2024-11-28 07:33:33.623480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.623527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:22.888 [2024-11-28 07:33:33.623541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.888 [2024-11-28 07:33:33.623591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:22.888 [2024-11-28 07:33:33.623609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.888 #15 NEW cov: 11751 ft: 12499 corp: 4/241b lim: 100 exec/s: 0 rss: 67Mb L: 90/90 MS: 1 CMP- DE: "\001\030"- 00:08:23.146 [2024-11-28 07:33:33.663361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.146 [2024-11-28 07:33:33.663388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.663424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.146 [2024-11-28 07:33:33.663439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.663491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.146 [2024-11-28 07:33:33.663506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.146 #21 NEW cov: 11836 ft: 12752 corp: 5/301b lim: 100 exec/s: 0 rss: 67Mb L: 60/90 MS: 1 ChangeByte- 00:08:23.146 [2024-11-28 07:33:33.703768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.146 [2024-11-28 07:33:33.703795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.703866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.146 [2024-11-28 07:33:33.703879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.703930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.146 [2024-11-28 07:33:33.703944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.703994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.146 [2024-11-28 07:33:33.704009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.704059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:23.146 [2024-11-28 07:33:33.704073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.146 #26 NEW cov: 11836 ft: 12876 corp: 6/401b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 5 ShuffleBytes-InsertByte-ShuffleBytes-EraseBytes-InsertRepeatedBytes- 00:08:23.146 [2024-11-28 07:33:33.743743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.146 [2024-11-28 07:33:33.743769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.743818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.146 [2024-11-28 07:33:33.743833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.743885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.146 [2024-11-28 07:33:33.743898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.743949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.146 [2024-11-28 07:33:33.743963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.146 #27 NEW cov: 11836 ft: 12964 corp: 7/491b lim: 100 exec/s: 0 rss: 67Mb L: 90/100 MS: 1 ChangeByte- 00:08:23.146 [2024-11-28 07:33:33.783717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.146 [2024-11-28 07:33:33.783743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.783781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.146 [2024-11-28 07:33:33.783795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.146 [2024-11-28 07:33:33.783846] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.147 [2024-11-28 07:33:33.783860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.147 #28 NEW cov: 11836 ft: 13083 corp: 8/551b lim: 100 exec/s: 0 rss: 67Mb L: 60/100 MS: 1 CopyPart- 00:08:23.147 [2024-11-28 07:33:33.823876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.147 [2024-11-28 07:33:33.823902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.823937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.147 [2024-11-28 07:33:33.823952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.824008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.147 [2024-11-28 07:33:33.824021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.147 #29 NEW cov: 11836 ft: 13136 corp: 9/611b lim: 100 exec/s: 0 rss: 67Mb L: 60/100 MS: 1 PersAutoDict- DE: "\001\030"- 00:08:23.147 [2024-11-28 07:33:33.864059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.147 [2024-11-28 07:33:33.864085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.864132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.147 [2024-11-28 07:33:33.864147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.864194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.147 [2024-11-28 07:33:33.864205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.864256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.147 [2024-11-28 07:33:33.864270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.147 #30 NEW cov: 11836 ft: 13191 corp: 10/701b lim: 100 exec/s: 0 rss: 67Mb L: 90/100 MS: 1 ChangeByte- 00:08:23.147 [2024-11-28 07:33:33.904197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.147 [2024-11-28 07:33:33.904223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.904267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.147 [2024-11-28 07:33:33.904281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.904331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:2 nsid:0 00:08:23.147 [2024-11-28 07:33:33.904345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.147 [2024-11-28 07:33:33.904375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.147 [2024-11-28 07:33:33.904389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 #31 NEW cov: 11836 ft: 13392 corp: 11/792b lim: 100 exec/s: 0 rss: 67Mb L: 91/100 MS: 1 InsertByte- 00:08:23.404 [2024-11-28 07:33:33.944191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:33.944217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.944256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:33.944270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.944319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:33.944333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 #32 NEW cov: 11836 ft: 13430 corp: 12/864b lim: 100 exec/s: 0 rss: 68Mb L: 72/100 MS: 1 EraseBytes- 00:08:23.404 [2024-11-28 07:33:33.984537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:33.984564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.984617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:33.984632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.984680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:33.984694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.984743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.404 [2024-11-28 07:33:33.984757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:33.984797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:23.404 [2024-11-28 07:33:33.984811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.404 #33 NEW cov: 11836 ft: 13520 corp: 13/964b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:23.404 [2024-11-28 07:33:34.024535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:34.024560] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.024604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:34.024618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.024666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:34.024678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.024731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.404 [2024-11-28 07:33:34.024745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 #34 NEW cov: 11836 ft: 13539 corp: 14/1054b lim: 100 exec/s: 0 rss: 68Mb L: 90/100 MS: 1 ChangeBit- 00:08:23.404 [2024-11-28 07:33:34.064630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:34.064656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.064716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:34.064730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.064790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:34.064805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.064853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.404 [2024-11-28 07:33:34.064867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 #35 NEW cov: 11836 ft: 13565 corp: 15/1146b lim: 100 exec/s: 0 rss: 68Mb L: 92/100 MS: 1 PersAutoDict- DE: "\001\030"- 00:08:23.404 [2024-11-28 07:33:34.104822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:34.104848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.104913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:34.104927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.104976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:34.104990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.105040] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.404 [2024-11-28 07:33:34.105053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 #36 NEW cov: 11836 ft: 13613 corp: 16/1239b lim: 100 exec/s: 0 rss: 68Mb L: 93/100 MS: 1 InsertRepeatedBytes- 00:08:23.404 [2024-11-28 07:33:34.144932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.404 [2024-11-28 07:33:34.144957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.145004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.404 [2024-11-28 07:33:34.145017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.145068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.404 [2024-11-28 07:33:34.145082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.404 [2024-11-28 07:33:34.145133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.404 [2024-11-28 07:33:34.145148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.404 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.405 #37 NEW cov: 11859 ft: 13627 corp: 17/1338b lim: 100 exec/s: 0 rss: 69Mb L: 99/100 MS: 1 CrossOver- 00:08:23.663 [2024-11-28 07:33:34.184906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.184934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.184983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.185002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.185052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.185066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 #38 NEW cov: 11859 ft: 13643 corp: 18/1398b lim: 100 exec/s: 0 rss: 69Mb L: 60/100 MS: 1 ChangeByte- 00:08:23.663 [2024-11-28 07:33:34.224984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.225009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.225053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.225068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.225120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.225133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 #39 NEW cov: 11859 ft: 13655 corp: 19/1477b lim: 100 exec/s: 0 rss: 69Mb L: 79/100 MS: 1 CopyPart- 00:08:23.663 [2024-11-28 07:33:34.265076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.265101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.265147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.265161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.265213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.265227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 #40 NEW cov: 11859 ft: 13681 corp: 20/1537b lim: 100 exec/s: 40 rss: 69Mb L: 60/100 MS: 1 ShuffleBytes- 00:08:23.663 [2024-11-28 07:33:34.305478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.305504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.305568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.305583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.305663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.305677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.305730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.663 [2024-11-28 07:33:34.305744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.305798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:23.663 [2024-11-28 07:33:34.305812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.663 #41 NEW cov: 11859 ft: 13751 corp: 21/1637b lim: 100 exec/s: 41 rss: 69Mb L: 100/100 MS: 1 CrossOver- 00:08:23.663 [2024-11-28 07:33:34.345416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.345444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.345482] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.345496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.345546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.345560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.345610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.663 [2024-11-28 07:33:34.345639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.663 #42 NEW cov: 11859 ft: 13775 corp: 22/1728b lim: 100 exec/s: 42 rss: 69Mb L: 91/100 MS: 1 InsertByte- 00:08:23.663 [2024-11-28 07:33:34.385568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.385594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.385654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.385669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.385720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.385733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.385784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.663 [2024-11-28 07:33:34.385798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.663 #43 NEW cov: 11859 ft: 13816 corp: 23/1820b lim: 100 exec/s: 43 rss: 69Mb L: 92/100 MS: 1 PersAutoDict- DE: "\001\030"- 00:08:23.663 [2024-11-28 07:33:34.425556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.663 [2024-11-28 07:33:34.425581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.425620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.663 [2024-11-28 07:33:34.425651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.663 [2024-11-28 07:33:34.425702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.663 [2024-11-28 07:33:34.425715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.921 #44 NEW cov: 11859 ft: 13849 corp: 24/1880b lim: 100 exec/s: 44 rss: 69Mb L: 60/100 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:23.921 [2024-11-28 07:33:34.465587] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.921 [2024-11-28 07:33:34.465617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.465670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.921 [2024-11-28 07:33:34.465684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.921 #48 NEW cov: 11859 ft: 14190 corp: 25/1927b lim: 100 exec/s: 48 rss: 69Mb L: 47/100 MS: 4 CrossOver-ChangeBit-InsertRepeatedBytes-CrossOver- 00:08:23.921 [2024-11-28 07:33:34.505849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.921 [2024-11-28 07:33:34.505874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.505918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.921 [2024-11-28 07:33:34.505933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.505984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.921 [2024-11-28 07:33:34.505998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.921 #49 NEW cov: 11859 ft: 14195 corp: 26/1999b lim: 100 exec/s: 49 rss: 69Mb L: 72/100 MS: 1 ChangeBinInt- 00:08:23.921 [2024-11-28 07:33:34.546063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.921 [2024-11-28 07:33:34.546088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.546150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.921 [2024-11-28 07:33:34.546165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.546215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.921 [2024-11-28 07:33:34.546230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.921 [2024-11-28 07:33:34.546277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.921 [2024-11-28 07:33:34.546290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.921 #50 NEW cov: 11859 ft: 14220 corp: 27/2090b lim: 100 exec/s: 50 rss: 69Mb L: 91/100 MS: 1 CrossOver- 00:08:23.922 [2024-11-28 07:33:34.586069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.922 [2024-11-28 07:33:34.586094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.586136] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.922 [2024-11-28 07:33:34.586150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.586200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.922 [2024-11-28 07:33:34.586213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.922 #51 NEW cov: 11859 ft: 14262 corp: 28/2150b lim: 100 exec/s: 51 rss: 69Mb L: 60/100 MS: 1 ChangeBit- 00:08:23.922 [2024-11-28 07:33:34.626372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.922 [2024-11-28 07:33:34.626397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.626452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.922 [2024-11-28 07:33:34.626466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.626517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:23.922 [2024-11-28 07:33:34.626532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.626585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:23.922 [2024-11-28 07:33:34.626602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.626620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:23.922 [2024-11-28 07:33:34.626646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.922 #52 NEW cov: 11859 ft: 14271 corp: 29/2250b lim: 100 exec/s: 52 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:23.922 [2024-11-28 07:33:34.666175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.922 [2024-11-28 07:33:34.666201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.922 [2024-11-28 07:33:34.666237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.922 [2024-11-28 07:33:34.666251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.922 #56 NEW cov: 11859 ft: 14287 corp: 30/2305b lim: 100 exec/s: 56 rss: 69Mb L: 55/100 MS: 4 InsertRepeatedBytes-InsertByte-CMP-InsertRepeatedBytes- DE: "\001\000"- 00:08:24.179 [2024-11-28 07:33:34.706505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.179 [2024-11-28 07:33:34.706531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.706570] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.179 [2024-11-28 07:33:34.706584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.706652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.179 [2024-11-28 07:33:34.706666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.706716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.179 [2024-11-28 07:33:34.706730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.179 #57 NEW cov: 11859 ft: 14294 corp: 31/2397b lim: 100 exec/s: 57 rss: 69Mb L: 92/100 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:24.179 [2024-11-28 07:33:34.746637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.179 [2024-11-28 07:33:34.746662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.746729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.179 [2024-11-28 07:33:34.746744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.746794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.179 [2024-11-28 07:33:34.746808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.746858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.179 [2024-11-28 07:33:34.746872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.179 #58 NEW cov: 11859 ft: 14307 corp: 32/2487b lim: 100 exec/s: 58 rss: 69Mb L: 90/100 MS: 1 ChangeByte- 00:08:24.179 [2024-11-28 07:33:34.786761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.179 [2024-11-28 07:33:34.786790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.786843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.179 [2024-11-28 07:33:34.786858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.786910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.179 [2024-11-28 07:33:34.786924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.786973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.179 [2024-11-28 07:33:34.786988] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.179 #59 NEW cov: 11859 ft: 14343 corp: 33/2586b lim: 100 exec/s: 59 rss: 70Mb L: 99/100 MS: 1 ChangeByte- 00:08:24.179 [2024-11-28 07:33:34.826890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.179 [2024-11-28 07:33:34.826915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.826979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.179 [2024-11-28 07:33:34.826994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.179 [2024-11-28 07:33:34.827045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.180 [2024-11-28 07:33:34.827059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.827109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.180 [2024-11-28 07:33:34.827123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.180 #60 NEW cov: 11859 ft: 14355 corp: 34/2678b lim: 100 exec/s: 60 rss: 70Mb L: 92/100 MS: 1 InsertByte- 00:08:24.180 [2024-11-28 07:33:34.866779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.180 [2024-11-28 07:33:34.866805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.866854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.180 [2024-11-28 07:33:34.866868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.180 #61 NEW cov: 11859 ft: 14372 corp: 35/2733b lim: 100 exec/s: 61 rss: 70Mb L: 55/100 MS: 1 ChangeBit- 00:08:24.180 [2024-11-28 07:33:34.907122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.180 [2024-11-28 07:33:34.907147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.907190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.180 [2024-11-28 07:33:34.907205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.907256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.180 [2024-11-28 07:33:34.907270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.907319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.180 [2024-11-28 07:33:34.907335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.180 #62 NEW cov: 11859 ft: 14385 corp: 36/2832b lim: 100 exec/s: 62 rss: 70Mb L: 99/100 MS: 1 ChangeBit- 00:08:24.180 [2024-11-28 07:33:34.947286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.180 [2024-11-28 07:33:34.947318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.947360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.180 [2024-11-28 07:33:34.947377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.947431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.180 [2024-11-28 07:33:34.947447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.180 [2024-11-28 07:33:34.947501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.180 [2024-11-28 07:33:34.947520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.438 #63 NEW cov: 11859 ft: 14406 corp: 37/2922b lim: 100 exec/s: 63 rss: 70Mb L: 90/100 MS: 1 CMP- DE: "\377\014"- 00:08:24.438 [2024-11-28 07:33:34.987227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 07:33:34.987253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:34.987319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.438 [2024-11-28 07:33:34.987333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:34.987386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.438 [2024-11-28 07:33:34.987400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.438 #64 NEW cov: 11859 ft: 14475 corp: 38/2982b lim: 100 exec/s: 64 rss: 70Mb L: 60/100 MS: 1 ChangeBinInt- 00:08:24.438 [2024-11-28 07:33:35.027451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 07:33:35.027476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.027524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.438 [2024-11-28 07:33:35.027538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.027587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.438 [2024-11-28 07:33:35.027607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:24.438 [2024-11-28 07:33:35.027662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.438 [2024-11-28 07:33:35.027676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.067592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 07:33:35.067621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.067675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.438 [2024-11-28 07:33:35.067689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.067741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.438 [2024-11-28 07:33:35.067755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.067806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.438 [2024-11-28 07:33:35.067820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.438 #66 NEW cov: 11859 ft: 14491 corp: 39/3081b lim: 100 exec/s: 66 rss: 70Mb L: 99/100 MS: 2 ShuffleBytes-ChangeBinInt- 00:08:24.438 [2024-11-28 07:33:35.107783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 07:33:35.107810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.107860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.438 [2024-11-28 07:33:35.107872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.107923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.438 [2024-11-28 07:33:35.107936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.107986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.438 [2024-11-28 07:33:35.107999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.438 [2024-11-28 07:33:35.108051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:24.438 [2024-11-28 07:33:35.108065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.438 #67 NEW cov: 11859 ft: 14509 corp: 40/3181b lim: 100 exec/s: 67 rss: 70Mb L: 100/100 MS: 1 ChangeByte- 00:08:24.438 [2024-11-28 07:33:35.147459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 
07:33:35.147486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.438 #68 NEW cov: 11859 ft: 14900 corp: 41/3202b lim: 100 exec/s: 68 rss: 70Mb L: 21/100 MS: 1 CrossOver- 00:08:24.438 [2024-11-28 07:33:35.187577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.438 [2024-11-28 07:33:35.187606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.696 #69 NEW cov: 11859 ft: 14957 corp: 42/3241b lim: 100 exec/s: 69 rss: 70Mb L: 39/100 MS: 1 EraseBytes- 00:08:24.696 [2024-11-28 07:33:35.227993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.696 [2024-11-28 07:33:35.228020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.696 [2024-11-28 07:33:35.228063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.696 [2024-11-28 07:33:35.228077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.696 [2024-11-28 07:33:35.228129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.696 [2024-11-28 07:33:35.228143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.696 [2024-11-28 07:33:35.228194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.696 [2024-11-28 07:33:35.228212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.696 #70 NEW cov: 11859 ft: 14961 corp: 43/3331b lim: 100 exec/s: 70 rss: 70Mb L: 90/100 MS: 1 ChangeBit- 00:08:24.696 [2024-11-28 07:33:35.267998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.696 [2024-11-28 07:33:35.268023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.696 [2024-11-28 07:33:35.268073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.696 [2024-11-28 07:33:35.268087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.696 [2024-11-28 07:33:35.268141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.696 [2024-11-28 07:33:35.268154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.696 #71 NEW cov: 11859 ft: 14988 corp: 44/3405b lim: 100 exec/s: 35 rss: 70Mb L: 74/100 MS: 1 PersAutoDict- DE: "\377\014"- 00:08:24.696 #71 DONE cov: 11859 ft: 14988 corp: 44/3405b lim: 100 exec/s: 35 rss: 70Mb 00:08:24.696 ###### Recommended dictionary. ###### 00:08:24.696 "\001\030" # Uses: 5 00:08:24.696 "\001\000\000\000" # Uses: 0 00:08:24.696 "\001\000" # Uses: 1 00:08:24.696 "\377\014" # Uses: 1 00:08:24.696 ###### End of recommended dictionary. 
###### 00:08:24.696 Done 71 runs in 2 second(s) 00:08:24.696 07:33:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:24.696 07:33:35 -- ../common.sh@72 -- # (( i++ )) 00:08:24.696 07:33:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.696 07:33:35 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:24.696 07:33:35 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:24.696 07:33:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:24.696 07:33:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:24.696 07:33:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:24.696 07:33:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:24.696 07:33:35 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:24.696 07:33:35 -- nvmf/run.sh@29 -- # port=4419 00:08:24.696 07:33:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:24.696 07:33:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:24.696 07:33:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:24.696 07:33:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:24.696 [2024-11-28 07:33:35.451342] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:24.696 [2024-11-28 07:33:35.451437] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661694 ] 00:08:24.954 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.954 [2024-11-28 07:33:35.629120] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.954 [2024-11-28 07:33:35.648418] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.954 [2024-11-28 07:33:35.648550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.954 [2024-11-28 07:33:35.699750] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.954 [2024-11-28 07:33:35.716135] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:25.211 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.211 INFO: Seed: 3946918091 00:08:25.211 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:25.211 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:25.211 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:25.211 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.211 #2 INITED exec/s: 0 rss: 59Mb 00:08:25.211 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:25.211 This may also happen if the target rejected all inputs we tried so far 00:08:25.211 [2024-11-28 07:33:35.760609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:65536 00:08:25.211 [2024-11-28 07:33:35.760642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.469 NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:25.469 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.469 #5 NEW cov: 11608 ft: 11609 corp: 2/16b lim: 50 exec/s: 0 rss: 67Mb L: 15/15 MS: 3 CopyPart-CMP-InsertRepeatedBytes- DE: "\000\000"- 00:08:25.469 [2024-11-28 07:33:36.091404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:65536 00:08:25.469 [2024-11-28 07:33:36.091442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.469 #6 NEW cov: 11723 ft: 12051 corp: 3/32b lim: 50 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertByte- 00:08:25.469 [2024-11-28 07:33:36.161489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:1 00:08:25.469 [2024-11-28 07:33:36.161518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.469 #7 NEW cov: 11729 ft: 12400 corp: 4/48b lim: 50 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:25.469 [2024-11-28 07:33:36.221616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446623105955725311 len:61968 00:08:25.469 [2024-11-28 07:33:36.221647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.728 #11 NEW cov: 11814 ft: 12702 corp: 5/62b lim: 50 exec/s: 0 rss: 67Mb L: 14/16 MS: 4 EraseBytes-ChangeByte-EraseBytes-CMP- DE: "\377\221\373\362\017\357\300t"- 00:08:25.728 [2024-11-28 07:33:36.271783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2233664447421939711 len:61968 00:08:25.728 [2024-11-28 07:33:36.271815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.728 #12 NEW cov: 11814 ft: 12767 corp: 6/76b lim: 50 exec/s: 0 rss: 67Mb L: 14/16 MS: 1 CMP- DE: "\377\036"- 00:08:25.728 [2024-11-28 07:33:36.332038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:25.728 [2024-11-28 07:33:36.332070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.728 [2024-11-28 07:33:36.332100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.332118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:25.728 [2024-11-28 07:33:36.332146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.332162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.728 [2024-11-28 07:33:36.332194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.332210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.728 #14 NEW cov: 11814 ft: 13196 corp: 7/123b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:25.728 [2024-11-28 07:33:36.392069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446623105955725311 len:61968 00:08:25.728 [2024-11-28 07:33:36.392100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.728 #15 NEW cov: 11814 ft: 13247 corp: 8/137b lim: 50 exec/s: 0 rss: 67Mb L: 14/47 MS: 1 CrossOver- 00:08:25.728 [2024-11-28 07:33:36.442296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:25.728 [2024-11-28 07:33:36.442326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.728 [2024-11-28 07:33:36.442371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.442388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.728 [2024-11-28 07:33:36.442417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.442433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.728 [2024-11-28 07:33:36.442460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:25.728 [2024-11-28 07:33:36.442476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.728 #16 NEW cov: 11814 ft: 13309 corp: 9/184b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 ChangeBinInt- 00:08:25.987 [2024-11-28 07:33:36.512360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:1 00:08:25.987 [2024-11-28 07:33:36.512389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.987 #17 NEW cov: 11814 ft: 13366 corp: 10/199b lim: 50 exec/s: 0 rss: 67Mb L: 15/47 MS: 1 EraseBytes- 00:08:25.987 [2024-11-28 07:33:36.572521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:65536 00:08:25.987 [2024-11-28 07:33:36.572550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.987 #23 NEW cov: 11814 ft: 13388 corp: 11/212b lim: 50 exec/s: 0 rss: 67Mb L: 13/47 MS: 1 EraseBytes- 00:08:25.987 [2024-11-28 07:33:36.622789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:3585 00:08:25.987 [2024-11-28 07:33:36.622818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.987 [2024-11-28 07:33:36.622865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:65370 00:08:25.987 [2024-11-28 07:33:36.622883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.987 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.987 #24 NEW cov: 11831 ft: 13659 corp: 12/236b lim: 50 exec/s: 0 rss: 68Mb L: 24/47 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:08:25.987 [2024-11-28 07:33:36.682838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:1 00:08:25.987 [2024-11-28 07:33:36.682872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.987 #25 NEW cov: 11831 ft: 13698 corp: 13/251b lim: 50 exec/s: 0 rss: 68Mb L: 15/47 MS: 1 ShuffleBytes- 00:08:25.987 [2024-11-28 07:33:36.743160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:25.987 [2024-11-28 07:33:36.743189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.987 [2024-11-28 07:33:36.743235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:25.987 [2024-11-28 07:33:36.743252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.987 [2024-11-28 07:33:36.743280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:25.987 [2024-11-28 07:33:36.743296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.987 [2024-11-28 07:33:36.743323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:25.987 [2024-11-28 07:33:36.743339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.245 #26 NEW cov: 11831 ft: 13796 corp: 14/298b lim: 50 exec/s: 26 rss: 68Mb L: 47/47 MS: 1 ChangeASCIIInt- 00:08:26.245 [2024-11-28 07:33:36.813195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:1 00:08:26.245 [2024-11-28 07:33:36.813224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.245 #27 NEW cov: 11831 ft: 13814 corp: 15/313b lim: 50 exec/s: 27 rss: 68Mb L: 
15/47 MS: 1 ShuffleBytes- 00:08:26.245 [2024-11-28 07:33:36.863296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:65536 00:08:26.245 [2024-11-28 07:33:36.863325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.245 #28 NEW cov: 11831 ft: 13837 corp: 16/330b lim: 50 exec/s: 28 rss: 68Mb L: 17/47 MS: 1 InsertByte- 00:08:26.245 [2024-11-28 07:33:36.913536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:26.245 [2024-11-28 07:33:36.913564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.245 [2024-11-28 07:33:36.913620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53957 00:08:26.245 [2024-11-28 07:33:36.913638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.245 [2024-11-28 07:33:36.913666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:26.245 [2024-11-28 07:33:36.913682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.245 [2024-11-28 07:33:36.913708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:26.245 [2024-11-28 07:33:36.913724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.245 #29 NEW cov: 11831 ft: 13894 corp: 17/377b lim: 50 exec/s: 29 rss: 68Mb L: 47/47 MS: 1 ChangeByte- 00:08:26.245 [2024-11-28 07:33:36.983722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069565579263 len:65536 00:08:26.245 [2024-11-28 07:33:36.983754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.245 [2024-11-28 07:33:36.983786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65528 00:08:26.245 [2024-11-28 07:33:36.983802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.502 #33 NEW cov: 11831 ft: 13915 corp: 18/397b lim: 50 exec/s: 33 rss: 68Mb L: 20/47 MS: 4 ChangeBit-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:26.502 [2024-11-28 07:33:37.033819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:3585 00:08:26.502 [2024-11-28 07:33:37.033848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.502 [2024-11-28 07:33:37.033894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:65370 00:08:26.502 [2024-11-28 07:33:37.033911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.502 
[2024-11-28 07:33:37.033940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:281470681743360 len:65291 00:08:26.502 [2024-11-28 07:33:37.033956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.502 #34 NEW cov: 11831 ft: 14151 corp: 19/427b lim: 50 exec/s: 34 rss: 68Mb L: 30/47 MS: 1 InsertRepeatedBytes- 00:08:26.502 [2024-11-28 07:33:37.103982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:65536 00:08:26.502 [2024-11-28 07:33:37.104011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.502 #35 NEW cov: 11831 ft: 14180 corp: 20/439b lim: 50 exec/s: 35 rss: 68Mb L: 12/47 MS: 1 EraseBytes- 00:08:26.502 [2024-11-28 07:33:37.164142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414595327 len:65536 00:08:26.502 [2024-11-28 07:33:37.164171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.502 #36 NEW cov: 11831 ft: 14184 corp: 21/456b lim: 50 exec/s: 36 rss: 68Mb L: 17/47 MS: 1 ChangeBit- 00:08:26.502 [2024-11-28 07:33:37.214214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17442441352511029247 len:37372 00:08:26.502 [2024-11-28 07:33:37.214243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.502 #37 NEW cov: 11831 ft: 14202 corp: 22/472b lim: 50 exec/s: 37 rss: 68Mb L: 16/47 MS: 1 CopyPart- 00:08:26.761 [2024-11-28 07:33:37.274478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069414587135 len:3585 00:08:26.761 [2024-11-28 07:33:37.274519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.274570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:65325 00:08:26.761 [2024-11-28 07:33:37.274587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.761 #38 NEW cov: 11831 ft: 14233 corp: 23/497b lim: 50 exec/s: 38 rss: 68Mb L: 25/47 MS: 1 InsertByte- 00:08:26.761 [2024-11-28 07:33:37.324616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:26.761 [2024-11-28 07:33:37.324645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.324695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.324714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.324743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.324758] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.324785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.324801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.761 #39 NEW cov: 11831 ft: 14288 corp: 24/544b lim: 50 exec/s: 39 rss: 68Mb L: 47/47 MS: 1 ShuffleBytes- 00:08:26.761 [2024-11-28 07:33:37.374796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15132106649215619794 len:53971 00:08:26.761 [2024-11-28 07:33:37.374827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.374859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.374876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.374904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.374920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.374946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15191436295996101330 len:53971 00:08:26.761 [2024-11-28 07:33:37.374962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.761 #40 NEW cov: 11831 ft: 14301 corp: 25/593b lim: 50 exec/s: 40 rss: 68Mb L: 49/49 MS: 1 CrossOver- 00:08:26.761 [2024-11-28 07:33:37.424766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2233664447421939711 len:7952 00:08:26.761 [2024-11-28 07:33:37.424796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.761 #41 NEW cov: 11831 ft: 14345 corp: 26/607b lim: 50 exec/s: 41 rss: 68Mb L: 14/49 MS: 1 ChangeByte- 00:08:26.761 [2024-11-28 07:33:37.495042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281470681808895 len:65536 00:08:26.761 [2024-11-28 07:33:37.495073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.761 [2024-11-28 07:33:37.495105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17275968697445118479 len:65536 00:08:26.761 [2024-11-28 07:33:37.495122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.020 #42 NEW cov: 11831 ft: 14427 corp: 27/635b lim: 50 exec/s: 42 rss: 68Mb L: 28/49 MS: 1 CopyPart- 00:08:27.020 [2024-11-28 07:33:37.545093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17506072289433681919 
len:37376 00:08:27.020 [2024-11-28 07:33:37.545122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.020 #43 NEW cov: 11831 ft: 14566 corp: 28/651b lim: 50 exec/s: 43 rss: 68Mb L: 16/49 MS: 1 ShuffleBytes- 00:08:27.020 [2024-11-28 07:33:37.605379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:27.020 [2024-11-28 07:33:37.605408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.020 [2024-11-28 07:33:37.605454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:27.020 [2024-11-28 07:33:37.605471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.020 [2024-11-28 07:33:37.605498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:27.020 [2024-11-28 07:33:37.605514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.020 [2024-11-28 07:33:37.605541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:1 00:08:27.020 [2024-11-28 07:33:37.605557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.020 #47 NEW cov: 11838 ft: 14621 corp: 29/696b lim: 50 exec/s: 47 rss: 69Mb L: 45/49 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:27.021 [2024-11-28 07:33:37.665460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069683022591 len:65536 00:08:27.021 [2024-11-28 07:33:37.665490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.021 #48 NEW cov: 11838 ft: 14633 corp: 30/712b lim: 50 exec/s: 48 rss: 69Mb L: 16/49 MS: 1 ChangeBinInt- 00:08:27.021 [2024-11-28 07:33:37.715652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15191436296650412754 len:53971 00:08:27.021 [2024-11-28 07:33:37.715680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.021 [2024-11-28 07:33:37.715727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 00:08:27.021 [2024-11-28 07:33:37.715744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.021 [2024-11-28 07:33:37.715772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15191436295996101330 len:53971 00:08:27.021 [2024-11-28 07:33:37.715788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.021 [2024-11-28 07:33:37.715815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3329236352478925522 len:53971 
00:08:27.021 [2024-11-28 07:33:37.715830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.021 #49 NEW cov: 11838 ft: 14645 corp: 31/753b lim: 50 exec/s: 24 rss: 69Mb L: 41/49 MS: 1 EraseBytes- 00:08:27.021 #49 DONE cov: 11838 ft: 14645 corp: 31/753b lim: 50 exec/s: 24 rss: 69Mb 00:08:27.021 ###### Recommended dictionary. ###### 00:08:27.021 "\000\000" # Uses: 1 00:08:27.021 "\377\221\373\362\017\357\300t" # Uses: 0 00:08:27.021 "\377\036" # Uses: 0 00:08:27.021 "\016\000\000\000\000\000\000\000" # Uses: 0 00:08:27.021 ###### End of recommended dictionary. ###### 00:08:27.021 Done 49 runs in 2 second(s) 00:08:27.279 07:33:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:27.279 07:33:37 -- ../common.sh@72 -- # (( i++ )) 00:08:27.279 07:33:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.279 07:33:37 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:27.279 07:33:37 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:27.279 07:33:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:27.279 07:33:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:27.279 07:33:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:27.279 07:33:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:27.279 07:33:37 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:27.279 07:33:37 -- nvmf/run.sh@29 -- # port=4420 00:08:27.279 07:33:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:27.279 07:33:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:27.279 07:33:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:27.279 07:33:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:27.279 [2024-11-28 07:33:37.906177] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:27.279 [2024-11-28 07:33:37.906261] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661736 ] 00:08:27.279 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.536 [2024-11-28 07:33:38.084767] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.536 [2024-11-28 07:33:38.104461] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:27.536 [2024-11-28 07:33:38.104573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.536 [2024-11-28 07:33:38.155849] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:27.537 [2024-11-28 07:33:38.172214] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:27.537 INFO: Running with entropic power schedule (0xFF, 100). 
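The xtrace lines above show how nvmf/run.sh stages fuzzer run 20: tear down the previous run's config, derive a per-run TCP port, patch it into a per-run copy of fuzz_json.conf, then launch llvm_nvme_fuzz against the freshly configured target. A condensed sketch of that flow follows; the 44NN port rule, the SPDK_DIR variable, and the redirect into the per-run config are read off (and partly assumed from) the trace, not taken from the script itself.

#!/usr/bin/env bash
# Sketch of the staging start_llvm_fuzz performs, per the xtrace above (run 20).
# SPDK_DIR and the redirect into $nvmf_cfg are assumptions; xtrace does not show them.
fuzzer_type=20                      # start_llvm_fuzz 20 1 0x1
timen=1                             # -t 1
core=0x1                            # -m 0x1
corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

printf -v nn '%02d' "$fuzzer_type"  # zero-padded run index: "20"
port="44${nn}"                      # run 20 listens on 4420, run 21 on 4421

mkdir -p "$corpus_dir"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

# Point the JSON config's listener at this run's port (a no-op when port is 4420).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

# Launch the LLVM NVMe fuzzer against the configured TCP target.
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" \
    -F "$trid" -c "$nvmf_cfg" -t "$timen" \
    -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"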
00:08:27.537 INFO: Seed: 2106953322 00:08:27.537 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:27.537 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:27.537 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:27.537 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.537 #2 INITED exec/s: 0 rss: 59Mb 00:08:27.537 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:27.537 This may also happen if the target rejected all inputs we tried so far 00:08:27.537 [2024-11-28 07:33:38.217713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.537 [2024-11-28 07:33:38.217743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.537 [2024-11-28 07:33:38.217781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.537 [2024-11-28 07:33:38.217796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.537 [2024-11-28 07:33:38.217848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.537 [2024-11-28 07:33:38.217864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.537 [2024-11-28 07:33:38.217919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:27.537 [2024-11-28 07:33:38.217933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.796 NEW_FUNC[1/672]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:27.796 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.796 #16 NEW cov: 11652 ft: 11653 corp: 2/87b lim: 90 exec/s: 0 rss: 67Mb L: 86/86 MS: 4 InsertByte-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:27.796 [2024-11-28 07:33:38.508345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.796 [2024-11-28 07:33:38.508377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.796 [2024-11-28 07:33:38.508439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.796 [2024-11-28 07:33:38.508454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.796 [2024-11-28 07:33:38.508508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:27.796 [2024-11-28 07:33:38.508525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.796 #31 NEW cov: 11781 ft: 12520 corp: 3/155b lim: 90 exec/s: 0 rss: 67Mb L: 68/86 MS: 5 ChangeBit-InsertByte-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:27.796 
[2024-11-28 07:33:38.548168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:27.796 [2024-11-28 07:33:38.548197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.796 [2024-11-28 07:33:38.548252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:27.796 [2024-11-28 07:33:38.548268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.056 #35 NEW cov: 11787 ft: 13065 corp: 4/206b lim: 90 exec/s: 0 rss: 67Mb L: 51/86 MS: 4 InsertRepeatedBytes-EraseBytes-CMP-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:08:28.056 [2024-11-28 07:33:38.588147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.588174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 #36 NEW cov: 11872 ft: 14206 corp: 5/241b lim: 90 exec/s: 0 rss: 67Mb L: 35/86 MS: 1 EraseBytes- 00:08:28.056 [2024-11-28 07:33:38.638432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.638459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 [2024-11-28 07:33:38.638512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.056 [2024-11-28 07:33:38.638528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.056 #37 NEW cov: 11872 ft: 14348 corp: 6/277b lim: 90 exec/s: 0 rss: 67Mb L: 36/86 MS: 1 InsertByte- 00:08:28.056 [2024-11-28 07:33:38.678531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.678559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 [2024-11-28 07:33:38.678608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.056 [2024-11-28 07:33:38.678625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.056 #43 NEW cov: 11872 ft: 14410 corp: 7/313b lim: 90 exec/s: 0 rss: 67Mb L: 36/86 MS: 1 ChangeByte- 00:08:28.056 [2024-11-28 07:33:38.718702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.718730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 [2024-11-28 07:33:38.718796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.056 [2024-11-28 07:33:38.718816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.056 #44 NEW cov: 11872 ft: 14467 corp: 8/353b lim: 90 exec/s: 0 rss: 67Mb L: 40/86 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:28.056 [2024-11-28 
07:33:38.758641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.758668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 #45 NEW cov: 11872 ft: 14539 corp: 9/387b lim: 90 exec/s: 0 rss: 67Mb L: 34/86 MS: 1 EraseBytes- 00:08:28.056 [2024-11-28 07:33:38.798891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.056 [2024-11-28 07:33:38.798918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.056 [2024-11-28 07:33:38.798962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.056 [2024-11-28 07:33:38.798977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.056 #46 NEW cov: 11872 ft: 14578 corp: 10/424b lim: 90 exec/s: 0 rss: 67Mb L: 37/86 MS: 1 CrossOver- 00:08:28.315 [2024-11-28 07:33:38.839334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.315 [2024-11-28 07:33:38.839362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.315 [2024-11-28 07:33:38.839411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.315 [2024-11-28 07:33:38.839426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.315 [2024-11-28 07:33:38.839480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.315 [2024-11-28 07:33:38.839494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.315 [2024-11-28 07:33:38.839547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.315 [2024-11-28 07:33:38.839563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.315 #47 NEW cov: 11872 ft: 14680 corp: 11/510b lim: 90 exec/s: 0 rss: 67Mb L: 86/86 MS: 1 CrossOver- 00:08:28.315 [2024-11-28 07:33:38.879160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:38.879188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:38.879228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.316 [2024-11-28 07:33:38.879244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.316 #48 NEW cov: 11872 ft: 14698 corp: 12/547b lim: 90 exec/s: 0 rss: 68Mb L: 37/86 MS: 1 ChangeBit- 00:08:28.316 [2024-11-28 07:33:38.919226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:38.919253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:38.919293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.316 [2024-11-28 07:33:38.919309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.316 #49 NEW cov: 11872 ft: 14722 corp: 13/598b lim: 90 exec/s: 0 rss: 68Mb L: 51/86 MS: 1 CrossOver- 00:08:28.316 [2024-11-28 07:33:38.959357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:38.959389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:38.959439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.316 [2024-11-28 07:33:38.959455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.316 #50 NEW cov: 11872 ft: 14737 corp: 14/634b lim: 90 exec/s: 0 rss: 68Mb L: 36/86 MS: 1 CrossOver- 00:08:28.316 [2024-11-28 07:33:38.999475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:38.999501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:38.999553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.316 [2024-11-28 07:33:38.999569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.316 #51 NEW cov: 11872 ft: 14767 corp: 15/674b lim: 90 exec/s: 0 rss: 68Mb L: 40/86 MS: 1 ChangeBit- 00:08:28.316 [2024-11-28 07:33:39.039734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:39.039761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:39.039807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.316 [2024-11-28 07:33:39.039823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.316 [2024-11-28 07:33:39.039875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.316 [2024-11-28 07:33:39.039890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.316 #52 NEW cov: 11872 ft: 14776 corp: 16/732b lim: 90 exec/s: 0 rss: 68Mb L: 58/86 MS: 1 InsertRepeatedBytes- 00:08:28.316 [2024-11-28 07:33:39.079562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.316 [2024-11-28 07:33:39.079589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.575 #53 
NEW cov: 11895 ft: 14809 corp: 17/766b lim: 90 exec/s: 0 rss: 68Mb L: 34/86 MS: 1 ChangeByte- 00:08:28.575 [2024-11-28 07:33:39.129672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.129698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 #54 NEW cov: 11895 ft: 14860 corp: 18/800b lim: 90 exec/s: 0 rss: 68Mb L: 34/86 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:28.575 [2024-11-28 07:33:39.169767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.169794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 #55 NEW cov: 11895 ft: 14879 corp: 19/834b lim: 90 exec/s: 0 rss: 68Mb L: 34/86 MS: 1 ChangeBit- 00:08:28.575 [2024-11-28 07:33:39.210232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.210259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 [2024-11-28 07:33:39.210315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.575 [2024-11-28 07:33:39.210335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.575 [2024-11-28 07:33:39.210390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.575 [2024-11-28 07:33:39.210405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.575 #56 NEW cov: 11895 ft: 14893 corp: 20/901b lim: 90 exec/s: 56 rss: 68Mb L: 67/86 MS: 1 CopyPart- 00:08:28.575 [2024-11-28 07:33:39.250042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.250069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 #57 NEW cov: 11895 ft: 14912 corp: 21/935b lim: 90 exec/s: 57 rss: 69Mb L: 34/86 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:28.575 [2024-11-28 07:33:39.290295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.290324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 [2024-11-28 07:33:39.290376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.575 [2024-11-28 07:33:39.290392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.575 #58 NEW cov: 11895 ft: 14952 corp: 22/975b lim: 90 exec/s: 58 rss: 69Mb L: 40/86 MS: 1 CMP- DE: "\000\000\000\007"- 00:08:28.575 [2024-11-28 07:33:39.330434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.575 [2024-11-28 07:33:39.330463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.575 [2024-11-28 07:33:39.330532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.575 [2024-11-28 07:33:39.330548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.835 #59 NEW cov: 11895 ft: 14970 corp: 23/1011b lim: 90 exec/s: 59 rss: 69Mb L: 36/86 MS: 1 ChangeBit- 00:08:28.835 [2024-11-28 07:33:39.370890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.370919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.370984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.835 [2024-11-28 07:33:39.371000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.371052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.835 [2024-11-28 07:33:39.371068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.371122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.835 [2024-11-28 07:33:39.371137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.835 #60 NEW cov: 11895 ft: 15021 corp: 24/1099b lim: 90 exec/s: 60 rss: 69Mb L: 88/88 MS: 1 CopyPart- 00:08:28.835 [2024-11-28 07:33:39.410536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.410563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 #61 NEW cov: 11895 ft: 15041 corp: 25/1133b lim: 90 exec/s: 61 rss: 69Mb L: 34/88 MS: 1 ChangeBit- 00:08:28.835 [2024-11-28 07:33:39.450623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.450670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 #62 NEW cov: 11895 ft: 15061 corp: 26/1161b lim: 90 exec/s: 62 rss: 69Mb L: 28/88 MS: 1 EraseBytes- 00:08:28.835 [2024-11-28 07:33:39.500920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.500948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.500983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.835 [2024-11-28 07:33:39.501000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.835 #63 NEW cov: 11895 ft: 15104 corp: 27/1198b lim: 90 exec/s: 63 rss: 69Mb L: 37/88 MS: 1 CopyPart- 00:08:28.835 [2024-11-28 07:33:39.541056] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.541082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.541125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.835 [2024-11-28 07:33:39.541141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.835 #64 NEW cov: 11895 ft: 15118 corp: 28/1249b lim: 90 exec/s: 64 rss: 69Mb L: 51/88 MS: 1 ShuffleBytes- 00:08:28.835 [2024-11-28 07:33:39.581625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.835 [2024-11-28 07:33:39.581653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.581695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.835 [2024-11-28 07:33:39.581708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.581765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.835 [2024-11-28 07:33:39.581780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.581832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.835 [2024-11-28 07:33:39.581846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.835 [2024-11-28 07:33:39.581902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:28.835 [2024-11-28 07:33:39.581917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.836 #65 NEW cov: 11895 ft: 15153 corp: 29/1339b lim: 90 exec/s: 65 rss: 69Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:29.095 [2024-11-28 07:33:39.621612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.621641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.621680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.095 [2024-11-28 07:33:39.621696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.621750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.095 [2024-11-28 07:33:39.621765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.621823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:08:29.095 [2024-11-28 07:33:39.621839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.095 #66 NEW cov: 11895 ft: 15155 corp: 30/1428b lim: 90 exec/s: 66 rss: 69Mb L: 89/90 MS: 1 InsertRepeatedBytes- 00:08:29.095 [2024-11-28 07:33:39.661667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.661694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.661759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.095 [2024-11-28 07:33:39.661776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.661832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.095 [2024-11-28 07:33:39.661847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.661902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.095 [2024-11-28 07:33:39.661916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.095 #67 NEW cov: 11895 ft: 15171 corp: 31/1516b lim: 90 exec/s: 67 rss: 69Mb L: 88/90 MS: 1 ShuffleBytes- 00:08:29.095 [2024-11-28 07:33:39.701801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.701829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.701895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.095 [2024-11-28 07:33:39.701911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.701966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.095 [2024-11-28 07:33:39.701982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.702040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.095 [2024-11-28 07:33:39.702056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.095 #68 NEW cov: 11895 ft: 15190 corp: 32/1602b lim: 90 exec/s: 68 rss: 69Mb L: 86/90 MS: 1 ShuffleBytes- 00:08:29.095 [2024-11-28 07:33:39.741449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.741477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 #70 NEW cov: 11895 ft: 15299 corp: 33/1627b lim: 90 exec/s: 70 rss: 69Mb L: 25/90 MS: 2 ShuffleBytes-CrossOver- 
00:08:29.095 [2024-11-28 07:33:39.781854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.781882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.781926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.095 [2024-11-28 07:33:39.781942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.782000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.095 [2024-11-28 07:33:39.782014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.095 #71 NEW cov: 11895 ft: 15309 corp: 34/1695b lim: 90 exec/s: 71 rss: 69Mb L: 68/90 MS: 1 CopyPart- 00:08:29.095 [2024-11-28 07:33:39.821988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.095 [2024-11-28 07:33:39.822014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.095 [2024-11-28 07:33:39.822052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.095 [2024-11-28 07:33:39.822066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.096 [2024-11-28 07:33:39.822118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.096 [2024-11-28 07:33:39.822133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.096 #72 NEW cov: 11895 ft: 15314 corp: 35/1763b lim: 90 exec/s: 72 rss: 69Mb L: 68/90 MS: 1 ShuffleBytes- 00:08:29.096 [2024-11-28 07:33:39.861814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.096 [2024-11-28 07:33:39.861841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.355 #73 NEW cov: 11895 ft: 15329 corp: 36/1782b lim: 90 exec/s: 73 rss: 69Mb L: 19/90 MS: 1 CrossOver- 00:08:29.355 [2024-11-28 07:33:39.902550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.355 [2024-11-28 07:33:39.902577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.902633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.355 [2024-11-28 07:33:39.902648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.902699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.355 [2024-11-28 07:33:39.902715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:29.355 [2024-11-28 07:33:39.902765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.355 [2024-11-28 07:33:39.902780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.902832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:29.355 [2024-11-28 07:33:39.902846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.355 #74 NEW cov: 11895 ft: 15350 corp: 37/1872b lim: 90 exec/s: 74 rss: 69Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:29.355 [2024-11-28 07:33:39.942503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.355 [2024-11-28 07:33:39.942530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.942581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.355 [2024-11-28 07:33:39.942596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.942654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.355 [2024-11-28 07:33:39.942670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.355 [2024-11-28 07:33:39.942724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.355 [2024-11-28 07:33:39.942739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.355 #75 NEW cov: 11895 ft: 15355 corp: 38/1961b lim: 90 exec/s: 75 rss: 69Mb L: 89/90 MS: 1 InsertRepeatedBytes- 00:08:29.356 [2024-11-28 07:33:39.982438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.356 [2024-11-28 07:33:39.982465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.356 [2024-11-28 07:33:39.982531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.356 [2024-11-28 07:33:39.982547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.356 [2024-11-28 07:33:39.982605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.356 [2024-11-28 07:33:39.982620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.356 #76 NEW cov: 11895 ft: 15372 corp: 39/2029b lim: 90 exec/s: 76 rss: 69Mb L: 68/90 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:29.356 [2024-11-28 07:33:40.022440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.356 [2024-11-28 07:33:40.022468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.356 [2024-11-28 07:33:40.022510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.356 [2024-11-28 07:33:40.022526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.356 #77 NEW cov: 11895 ft: 15387 corp: 40/2073b lim: 90 exec/s: 77 rss: 69Mb L: 44/90 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:29.356 [2024-11-28 07:33:40.072408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.356 [2024-11-28 07:33:40.072437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.356 #78 NEW cov: 11895 ft: 15392 corp: 41/2107b lim: 90 exec/s: 78 rss: 69Mb L: 34/90 MS: 1 ChangeBinInt- 00:08:29.356 [2024-11-28 07:33:40.112846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.356 [2024-11-28 07:33:40.112876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.356 [2024-11-28 07:33:40.112915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.356 [2024-11-28 07:33:40.112931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.356 [2024-11-28 07:33:40.112983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.356 [2024-11-28 07:33:40.112999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.615 #79 NEW cov: 11895 ft: 15419 corp: 42/2175b lim: 90 exec/s: 79 rss: 70Mb L: 68/90 MS: 1 ChangeBinInt- 00:08:29.615 [2024-11-28 07:33:40.152872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.615 [2024-11-28 07:33:40.152899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.615 [2024-11-28 07:33:40.152958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.615 [2024-11-28 07:33:40.152977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.615 #80 NEW cov: 11895 ft: 15432 corp: 43/2212b lim: 90 exec/s: 80 rss: 70Mb L: 37/90 MS: 1 CrossOver- 00:08:29.615 [2024-11-28 07:33:40.193435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.616 [2024-11-28 07:33:40.193462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.616 [2024-11-28 07:33:40.193518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.616 [2024-11-28 07:33:40.193534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.616 [2024-11-28 07:33:40.193589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 
00:08:29.616 [2024-11-28 07:33:40.193608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.616 [2024-11-28 07:33:40.193660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.616 [2024-11-28 07:33:40.193679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.616 [2024-11-28 07:33:40.193720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:29.616 [2024-11-28 07:33:40.193736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.616 #81 NEW cov: 11895 ft: 15436 corp: 44/2302b lim: 90 exec/s: 40 rss: 70Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:29.616 #81 DONE cov: 11895 ft: 15436 corp: 44/2302b lim: 90 exec/s: 40 rss: 70Mb 00:08:29.616 ###### Recommended dictionary. ###### 00:08:29.616 "\377\377\377\377" # Uses: 3 00:08:29.616 "\000\000\000\007" # Uses: 2 00:08:29.616 ###### End of recommended dictionary. ###### 00:08:29.616 Done 81 runs in 2 second(s) 00:08:29.616 07:33:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:29.616 07:33:40 -- ../common.sh@72 -- # (( i++ )) 00:08:29.616 07:33:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.616 07:33:40 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:29.616 07:33:40 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:29.616 07:33:40 -- nvmf/run.sh@24 -- # local timen=1 00:08:29.616 07:33:40 -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.616 07:33:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:29.616 07:33:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:29.616 07:33:40 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:29.616 07:33:40 -- nvmf/run.sh@29 -- # port=4421 00:08:29.616 07:33:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:29.616 07:33:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:29.616 07:33:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.616 07:33:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:29.616 [2024-11-28 07:33:40.367525] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
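Run 21 above repeats the same staging with fuzzer_type=21: printf %02d 21 yields port 4421, so this time the sed genuinely rewrites the template's trsvcid (for run 20 it was a 4420-to-4420 no-op), and the fuzzer then attaches to /var/tmp/spdk21.sock; this matches the sketch shown after run 20.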
00:08:29.616 [2024-11-28 07:33:40.367631] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661778 ] 00:08:29.875 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.875 [2024-11-28 07:33:40.552842] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.875 [2024-11-28 07:33:40.572919] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:29.875 [2024-11-28 07:33:40.573037] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.875 [2024-11-28 07:33:40.624630] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.875 [2024-11-28 07:33:40.640998] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:30.134 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.134 INFO: Seed: 279991310 00:08:30.134 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:30.134 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:30.134 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:30.134 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.134 #2 INITED exec/s: 0 rss: 60Mb 00:08:30.134 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:30.134 This may also happen if the target rejected all inputs we tried so far 00:08:30.134 [2024-11-28 07:33:40.690034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.134 [2024-11-28 07:33:40.690065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.134 [2024-11-28 07:33:40.690118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.134 [2024-11-28 07:33:40.690135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.393 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:30.393 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.393 #3 NEW cov: 11643 ft: 11644 corp: 2/24b lim: 50 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:30.393 [2024-11-28 07:33:41.010826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.393 [2024-11-28 07:33:41.010861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.393 [2024-11-28 07:33:41.010928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.393 [2024-11-28 07:33:41.010944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.393 #4 NEW cov: 11756 ft: 12101 corp: 3/47b lim: 50 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeBit- 00:08:30.393 [2024-11-28 07:33:41.060859] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.393 [2024-11-28 07:33:41.060886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.393 [2024-11-28 07:33:41.060940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.393 [2024-11-28 07:33:41.060956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.393 #5 NEW cov: 11762 ft: 12294 corp: 4/74b lim: 50 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:30.393 [2024-11-28 07:33:41.101108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.393 [2024-11-28 07:33:41.101136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.394 [2024-11-28 07:33:41.101172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.394 [2024-11-28 07:33:41.101186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.394 [2024-11-28 07:33:41.101234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.394 [2024-11-28 07:33:41.101252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.394 #6 NEW cov: 11847 ft: 12967 corp: 5/105b lim: 50 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:30.394 [2024-11-28 07:33:41.151247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.394 [2024-11-28 07:33:41.151273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.394 [2024-11-28 07:33:41.151307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.394 [2024-11-28 07:33:41.151323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.394 [2024-11-28 07:33:41.151375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.394 [2024-11-28 07:33:41.151390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.654 #7 NEW cov: 11847 ft: 13051 corp: 6/136b lim: 50 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 ShuffleBytes- 00:08:30.654 [2024-11-28 07:33:41.191199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.191224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.191291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.654 [2024-11-28 07:33:41.191306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.654 #11 NEW cov: 
11847 ft: 13196 corp: 7/164b lim: 50 exec/s: 0 rss: 68Mb L: 28/31 MS: 4 InsertByte-InsertByte-PersAutoDict-CrossOver- DE: "\377\377\377\377"- 00:08:30.654 [2024-11-28 07:33:41.231309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.231336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.231384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.654 [2024-11-28 07:33:41.231399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.654 #12 NEW cov: 11847 ft: 13326 corp: 8/187b lim: 50 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 CMP- DE: "\001\000\377\377"- 00:08:30.654 [2024-11-28 07:33:41.271382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.271409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.271447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.654 [2024-11-28 07:33:41.271461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.654 #13 NEW cov: 11847 ft: 13399 corp: 9/210b lim: 50 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 ChangeBit- 00:08:30.654 [2024-11-28 07:33:41.311550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.311577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.311619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.654 [2024-11-28 07:33:41.311651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.654 #14 NEW cov: 11847 ft: 13442 corp: 10/233b lim: 50 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 ShuffleBytes- 00:08:30.654 [2024-11-28 07:33:41.351536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.351564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 #15 NEW cov: 11847 ft: 14253 corp: 11/246b lim: 50 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 EraseBytes- 00:08:30.654 [2024-11-28 07:33:41.392054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.654 [2024-11-28 07:33:41.392081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.392118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.654 [2024-11-28 07:33:41.392133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.392185] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.654 [2024-11-28 07:33:41.392200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.654 [2024-11-28 07:33:41.392248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.654 [2024-11-28 07:33:41.392262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.654 #16 NEW cov: 11847 ft: 14592 corp: 12/291b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 CopyPart- 00:08:30.914 [2024-11-28 07:33:41.432097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.432125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.432158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.432174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.432224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.914 [2024-11-28 07:33:41.432239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.914 #17 NEW cov: 11847 ft: 14652 corp: 13/322b lim: 50 exec/s: 0 rss: 68Mb L: 31/45 MS: 1 CopyPart- 00:08:30.914 [2024-11-28 07:33:41.471965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.471992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.472033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.472048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 #18 NEW cov: 11847 ft: 14666 corp: 14/346b lim: 50 exec/s: 0 rss: 68Mb L: 24/45 MS: 1 InsertByte- 00:08:30.914 [2024-11-28 07:33:41.512397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.512423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.512483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.512499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.512548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.914 [2024-11-28 07:33:41.512567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.512621] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.914 [2024-11-28 07:33:41.512636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.914 #19 NEW cov: 11847 ft: 14683 corp: 15/391b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 ChangeByte- 00:08:30.914 [2024-11-28 07:33:41.552234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.552261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.552299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.552314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:30.914 #20 NEW cov: 11870 ft: 14732 corp: 16/419b lim: 50 exec/s: 0 rss: 68Mb L: 28/45 MS: 1 ChangeBit- 00:08:30.914 [2024-11-28 07:33:41.592337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.592364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.592404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.592419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 #21 NEW cov: 11870 ft: 14775 corp: 17/442b lim: 50 exec/s: 0 rss: 68Mb L: 23/45 MS: 1 ShuffleBytes- 00:08:30.914 [2024-11-28 07:33:41.632433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.632459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.632493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.632508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 #22 NEW cov: 11870 ft: 14788 corp: 18/465b lim: 50 exec/s: 0 rss: 68Mb L: 23/45 MS: 1 ChangeByte- 00:08:30.914 [2024-11-28 07:33:41.662688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.914 [2024-11-28 07:33:41.662716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.662756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.914 [2024-11-28 07:33:41.662771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.914 [2024-11-28 07:33:41.662820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 
00:08:30.914 [2024-11-28 07:33:41.662836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.173 #23 NEW cov: 11870 ft: 14827 corp: 19/497b lim: 50 exec/s: 23 rss: 68Mb L: 32/45 MS: 1 InsertByte- 00:08:31.173 [2024-11-28 07:33:41.702687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.173 [2024-11-28 07:33:41.702714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.702785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.173 [2024-11-28 07:33:41.702801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.173 #24 NEW cov: 11870 ft: 14836 corp: 20/525b lim: 50 exec/s: 24 rss: 68Mb L: 28/45 MS: 1 ShuffleBytes- 00:08:31.173 [2024-11-28 07:33:41.732771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.173 [2024-11-28 07:33:41.732797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.732845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.173 [2024-11-28 07:33:41.732861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.173 #25 NEW cov: 11870 ft: 14863 corp: 21/553b lim: 50 exec/s: 25 rss: 68Mb L: 28/45 MS: 1 ChangeByte- 00:08:31.173 [2024-11-28 07:33:41.773029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.173 [2024-11-28 07:33:41.773055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.773092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.173 [2024-11-28 07:33:41.773108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.773156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.173 [2024-11-28 07:33:41.773171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.173 #26 NEW cov: 11870 ft: 14870 corp: 22/585b lim: 50 exec/s: 26 rss: 69Mb L: 32/45 MS: 1 InsertByte- 00:08:31.173 [2024-11-28 07:33:41.813242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.173 [2024-11-28 07:33:41.813268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.813331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.173 [2024-11-28 07:33:41.813347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 
07:33:41.813398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.173 [2024-11-28 07:33:41.813414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.173 [2024-11-28 07:33:41.813466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.173 [2024-11-28 07:33:41.813481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.173 #27 NEW cov: 11870 ft: 14881 corp: 23/633b lim: 50 exec/s: 27 rss: 69Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:31.174 [2024-11-28 07:33:41.853278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.174 [2024-11-28 07:33:41.853305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.174 [2024-11-28 07:33:41.853343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.174 [2024-11-28 07:33:41.853359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.174 [2024-11-28 07:33:41.853409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.174 [2024-11-28 07:33:41.853426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.174 #28 NEW cov: 11870 ft: 14906 corp: 24/664b lim: 50 exec/s: 28 rss: 69Mb L: 31/48 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.174 [2024-11-28 07:33:41.893201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.174 [2024-11-28 07:33:41.893228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.174 [2024-11-28 07:33:41.893269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.174 [2024-11-28 07:33:41.893284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.174 #29 NEW cov: 11870 ft: 14910 corp: 25/687b lim: 50 exec/s: 29 rss: 69Mb L: 23/48 MS: 1 ChangeBit- 00:08:31.174 [2024-11-28 07:33:41.923489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.174 [2024-11-28 07:33:41.923515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.174 [2024-11-28 07:33:41.923574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.174 [2024-11-28 07:33:41.923590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 #30 NEW cov: 11870 ft: 14923 corp: 26/714b lim: 50 exec/s: 30 rss: 69Mb L: 27/48 MS: 1 ChangeBinInt- 00:08:31.434 [2024-11-28 07:33:41.963703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:41.963729] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:41.963791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:41.963806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:41.963858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.434 [2024-11-28 07:33:41.963874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:41.963926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.434 [2024-11-28 07:33:41.963941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.434 #31 NEW cov: 11870 ft: 14940 corp: 27/763b lim: 50 exec/s: 31 rss: 69Mb L: 49/49 MS: 1 CopyPart- 00:08:31.434 [2024-11-28 07:33:42.003806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:42.003834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.003870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:42.003885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.003934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.434 [2024-11-28 07:33:42.003948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.004000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.434 [2024-11-28 07:33:42.004015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.434 #32 NEW cov: 11870 ft: 14969 corp: 28/811b lim: 50 exec/s: 32 rss: 69Mb L: 48/49 MS: 1 ShuffleBytes- 00:08:31.434 [2024-11-28 07:33:42.043690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:42.043716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.043755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:42.043771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 #33 NEW cov: 11870 ft: 14987 corp: 29/834b lim: 50 exec/s: 33 rss: 69Mb L: 23/49 MS: 1 CopyPart- 00:08:31.434 [2024-11-28 07:33:42.083792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:42.083819] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.083875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:42.083892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 #34 NEW cov: 11870 ft: 14992 corp: 30/858b lim: 50 exec/s: 34 rss: 69Mb L: 24/49 MS: 1 EraseBytes- 00:08:31.434 [2024-11-28 07:33:42.123906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:42.123932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.123971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:42.123986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 #40 NEW cov: 11870 ft: 15001 corp: 31/881b lim: 50 exec/s: 40 rss: 69Mb L: 23/49 MS: 1 EraseBytes- 00:08:31.434 [2024-11-28 07:33:42.163998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.434 [2024-11-28 07:33:42.164024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.434 [2024-11-28 07:33:42.164064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.434 [2024-11-28 07:33:42.164096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.434 #41 NEW cov: 11870 ft: 15029 corp: 32/906b lim: 50 exec/s: 41 rss: 69Mb L: 25/49 MS: 1 InsertByte- 00:08:31.694 [2024-11-28 07:33:42.204123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.204151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.204188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.204204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 #42 NEW cov: 11870 ft: 15047 corp: 33/930b lim: 50 exec/s: 42 rss: 69Mb L: 24/49 MS: 1 CopyPart- 00:08:31.694 [2024-11-28 07:33:42.244390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.244416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.244452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.244466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.244520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.694 [2024-11-28 07:33:42.244535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.694 #43 NEW cov: 11870 ft: 15063 corp: 34/962b lim: 50 exec/s: 43 rss: 69Mb L: 32/49 MS: 1 ChangeBit- 00:08:31.694 [2024-11-28 07:33:42.284456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.284482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.284517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.284532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.284583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.694 [2024-11-28 07:33:42.284602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.694 #44 NEW cov: 11870 ft: 15068 corp: 35/994b lim: 50 exec/s: 44 rss: 69Mb L: 32/49 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:31.694 [2024-11-28 07:33:42.324622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.324649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.324688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.324703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.324753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.694 [2024-11-28 07:33:42.324768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.694 #45 NEW cov: 11870 ft: 15076 corp: 36/1025b lim: 50 exec/s: 45 rss: 69Mb L: 31/49 MS: 1 ShuffleBytes- 00:08:31.694 [2024-11-28 07:33:42.364693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.364720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.364774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.364791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.364839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.694 [2024-11-28 07:33:42.364854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.694 #46 NEW cov: 11870 ft: 15108 corp: 37/1060b lim: 50 exec/s: 
46 rss: 69Mb L: 35/49 MS: 1 InsertRepeatedBytes- 00:08:31.694 [2024-11-28 07:33:42.404939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.404966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.405003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.694 [2024-11-28 07:33:42.405018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.405072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.694 [2024-11-28 07:33:42.405085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.694 [2024-11-28 07:33:42.405135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.694 [2024-11-28 07:33:42.405150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.694 #47 NEW cov: 11870 ft: 15125 corp: 38/1100b lim: 50 exec/s: 47 rss: 69Mb L: 40/49 MS: 1 InsertRepeatedBytes- 00:08:31.694 [2024-11-28 07:33:42.444658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.694 [2024-11-28 07:33:42.444686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 #48 NEW cov: 11870 ft: 15193 corp: 39/1113b lim: 50 exec/s: 48 rss: 70Mb L: 13/49 MS: 1 ShuffleBytes- 00:08:31.954 [2024-11-28 07:33:42.484803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.484829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 #49 NEW cov: 11870 ft: 15197 corp: 40/1129b lim: 50 exec/s: 49 rss: 70Mb L: 16/49 MS: 1 EraseBytes- 00:08:31.954 [2024-11-28 07:33:42.525037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.525063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.525099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.954 [2024-11-28 07:33:42.525114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.954 #50 NEW cov: 11870 ft: 15218 corp: 41/1156b lim: 50 exec/s: 50 rss: 70Mb L: 27/49 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:31.954 [2024-11-28 07:33:42.555131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.555157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.555207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:1 nsid:0 00:08:31.954 [2024-11-28 07:33:42.555221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.954 #51 NEW cov: 11870 ft: 15277 corp: 42/1179b lim: 50 exec/s: 51 rss: 70Mb L: 23/49 MS: 1 ChangeBit- 00:08:31.954 [2024-11-28 07:33:42.585208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.585234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.585271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.954 [2024-11-28 07:33:42.585287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.954 #52 NEW cov: 11870 ft: 15334 corp: 43/1202b lim: 50 exec/s: 52 rss: 70Mb L: 23/49 MS: 1 ChangeBit- 00:08:31.954 [2024-11-28 07:33:42.625593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.625623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.625673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.954 [2024-11-28 07:33:42.625687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.625739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.954 [2024-11-28 07:33:42.625754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.625805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.954 [2024-11-28 07:33:42.625820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.954 #53 NEW cov: 11870 ft: 15341 corp: 44/1250b lim: 50 exec/s: 53 rss: 70Mb L: 48/49 MS: 1 CrossOver- 00:08:31.954 [2024-11-28 07:33:42.665435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.954 [2024-11-28 07:33:42.665463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.954 [2024-11-28 07:33:42.665500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.954 [2024-11-28 07:33:42.665530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.954 #54 NEW cov: 11870 ft: 15355 corp: 45/1273b lim: 50 exec/s: 27 rss: 70Mb L: 23/49 MS: 1 ChangeBinInt- 00:08:31.954 #54 DONE cov: 11870 ft: 15355 corp: 45/1273b lim: 50 exec/s: 27 rss: 70Mb 00:08:31.954 ###### Recommended dictionary. 
######
00:08:31.954 "\377\377\377\377" # Uses: 3
00:08:31.954 "\001\000\377\377" # Uses: 1
00:08:31.954 "\000\000\000\000\000\000\000\000" # Uses: 0
00:08:31.954 "\377\377\377\377\377\377\377\377" # Uses: 0
00:08:31.954 ###### End of recommended dictionary. ######
00:08:31.954 Done 54 runs in 2 second(s)
00:08:32.214 07:33:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf
00:08:32.214 07:33:42 -- ../common.sh@72 -- # (( i++ ))
00:08:32.214 07:33:42 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:32.214 07:33:42 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
00:08:32.214 07:33:42 -- nvmf/run.sh@23 -- # local fuzzer_type=22
00:08:32.214 07:33:42 -- nvmf/run.sh@24 -- # local timen=1
00:08:32.214 07:33:42 -- nvmf/run.sh@25 -- # local core=0x1
00:08:32.214 07:33:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:32.214 07:33:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
00:08:32.214 07:33:42 -- nvmf/run.sh@29 -- # printf %02d 22
00:08:32.214 07:33:42 -- nvmf/run.sh@29 -- # port=4422
00:08:32.214 07:33:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:32.214 07:33:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
00:08:32.214 07:33:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:32.214 07:33:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock
00:08:32.474 [2024-11-28 07:33:42.846304] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:32.474 [2024-11-28 07:33:42.846396] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661826 ]
00:08:32.474 EAL: No free 2048 kB hugepages reported on node 1
00:08:32.474 [2024-11-28 07:33:43.020039] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:32.474 [2024-11-28 07:33:43.039707] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:32.474 [2024-11-28 07:33:43.039838] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:32.474 [2024-11-28 07:33:43.091086] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:32.474 [2024-11-28 07:33:43.107447] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:32.474 INFO: Running with entropic power schedule (0xFF, 100).
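The "Recommended dictionary" block in the run-21 summary above is standard libFuzzer end-of-run output: each quoted token is a byte string (printed with octal escapes) that helped reach new coverage, and "Uses" counts how often the entry fed a successful mutation. A minimal sketch, assuming bash and the usual libFuzzer -dict= convention, of how those tokens could be recycled into a dictionary file for later runs; the output path, the entry list, and the flag passthrough are illustrative, not part of these CI scripts:

  # Sketch only: turn the "Recommended dictionary" tokens printed above into
  # a libFuzzer dictionary file. The path and entry list are illustrative.
  # Escapes such as \377 in the log are octal; libFuzzer dictionaries want
  # one quoted token per line, with raw bytes written as \xHH hex escapes.
  dict=/tmp/llvm_nvmf_21.dict
  : > "$dict"
  for e in '\377\377\377\377' '\001\000\377\377' \
           '\000\000\000\000\000\000\000\000' '\377\377\377\377\377\377\377\377'; do
    # bash printf expands \nnn octal escapes in its format string to raw bytes;
    # od re-encodes them as two-digit hex, then sed makes one \xHH escape per byte.
    hex=$(printf "$e" | od -An -tx1 | tr -d ' \n' | sed 's/../\\x&/g')
    printf '"%s"\n' "$hex" >> "$dict"
  done
  # Assumption, not verified against the harness: if llvm_nvme_fuzz forwards
  # unrecognized flags to libFuzzer, the file could be passed as -dict="$dict".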
00:08:32.474 INFO: Seed: 2746966676 00:08:32.474 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:32.474 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:32.474 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:32.474 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.474 #2 INITED exec/s: 0 rss: 59Mb 00:08:32.474 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.474 This may also happen if the target rejected all inputs we tried so far 00:08:32.474 [2024-11-28 07:33:43.152706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.474 [2024-11-28 07:33:43.152736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.474 [2024-11-28 07:33:43.152789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.474 [2024-11-28 07:33:43.152804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.474 [2024-11-28 07:33:43.152856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.474 [2024-11-28 07:33:43.152870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.734 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:32.734 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.734 #12 NEW cov: 11667 ft: 11670 corp: 2/56b lim: 85 exec/s: 0 rss: 67Mb L: 55/55 MS: 5 ChangeBit-ChangeBit-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:32.734 [2024-11-28 07:33:43.443593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.734 [2024-11-28 07:33:43.443631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.734 [2024-11-28 07:33:43.443686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.734 [2024-11-28 07:33:43.443701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.734 [2024-11-28 07:33:43.443757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.734 [2024-11-28 07:33:43.443772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.734 #14 NEW cov: 11782 ft: 12065 corp: 3/123b lim: 85 exec/s: 0 rss: 67Mb L: 67/67 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:32.734 [2024-11-28 07:33:43.483538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.734 [2024-11-28 07:33:43.483566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.734 [2024-11-28 
07:33:43.483625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.734 [2024-11-28 07:33:43.483641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.734 [2024-11-28 07:33:43.483694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.734 [2024-11-28 07:33:43.483710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.994 #15 NEW cov: 11788 ft: 12446 corp: 4/187b lim: 85 exec/s: 0 rss: 67Mb L: 64/67 MS: 1 InsertRepeatedBytes- 00:08:32.994 [2024-11-28 07:33:43.523643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.994 [2024-11-28 07:33:43.523670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.523720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.994 [2024-11-28 07:33:43.523736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.523786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.994 [2024-11-28 07:33:43.523801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.994 #17 NEW cov: 11873 ft: 12743 corp: 5/250b lim: 85 exec/s: 0 rss: 67Mb L: 63/67 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:32.994 [2024-11-28 07:33:43.563785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.994 [2024-11-28 07:33:43.563813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.563854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.994 [2024-11-28 07:33:43.563870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.563925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.994 [2024-11-28 07:33:43.563939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.994 #18 NEW cov: 11873 ft: 12833 corp: 6/313b lim: 85 exec/s: 0 rss: 67Mb L: 63/67 MS: 1 ShuffleBytes- 00:08:32.994 [2024-11-28 07:33:43.603786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.994 [2024-11-28 07:33:43.603814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.603865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.994 [2024-11-28 07:33:43.603881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:32.994 #19 NEW cov: 11873 ft: 13248 corp: 7/347b lim: 85 exec/s: 0 rss: 67Mb L: 34/67 MS: 1 InsertRepeatedBytes- 00:08:32.994 [2024-11-28 07:33:43.643756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.994 [2024-11-28 07:33:43.643783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.994 #20 NEW cov: 11873 ft: 14118 corp: 8/380b lim: 85 exec/s: 0 rss: 67Mb L: 33/67 MS: 1 EraseBytes- 00:08:32.994 [2024-11-28 07:33:43.694164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.994 [2024-11-28 07:33:43.694190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.994 [2024-11-28 07:33:43.694231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.995 [2024-11-28 07:33:43.694245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 07:33:43.694298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.995 [2024-11-28 07:33:43.694313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.995 #21 NEW cov: 11873 ft: 14242 corp: 9/443b lim: 85 exec/s: 0 rss: 67Mb L: 63/67 MS: 1 ChangeBit- 00:08:32.995 [2024-11-28 07:33:43.734261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 07:33:43.734289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 07:33:43.734329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:32.995 [2024-11-28 07:33:43.734344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 07:33:43.734397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:32.995 [2024-11-28 07:33:43.734412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.995 #22 NEW cov: 11873 ft: 14265 corp: 10/506b lim: 85 exec/s: 0 rss: 67Mb L: 63/67 MS: 1 ChangeBit- 00:08:33.254 [2024-11-28 07:33:43.774394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 07:33:43.774421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.774479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 07:33:43.774496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.774550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.254 [2024-11-28 07:33:43.774565] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.254 #23 NEW cov: 11873 ft: 14320 corp: 11/570b lim: 85 exec/s: 0 rss: 67Mb L: 64/67 MS: 1 ShuffleBytes- 00:08:33.254 [2024-11-28 07:33:43.814731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 07:33:43.814759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.814809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 07:33:43.814825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.814876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.254 [2024-11-28 07:33:43.814891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.814944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.254 [2024-11-28 07:33:43.814960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.254 #24 NEW cov: 11873 ft: 14817 corp: 12/644b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 CopyPart- 00:08:33.254 [2024-11-28 07:33:43.854492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 07:33:43.854520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.854559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 07:33:43.854575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 #25 NEW cov: 11873 ft: 14847 corp: 13/678b lim: 85 exec/s: 0 rss: 67Mb L: 34/74 MS: 1 CopyPart- 00:08:33.254 [2024-11-28 07:33:43.894912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 07:33:43.894944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.894988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 07:33:43.895004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.895057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.254 [2024-11-28 07:33:43.895072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 07:33:43.895125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.254 [2024-11-28 07:33:43.895141] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.254 #26 NEW cov: 11873 ft: 14868 corp: 14/750b lim: 85 exec/s: 0 rss: 67Mb L: 72/74 MS: 1 CMP- DE: "j\352\210\204\366\373\222\000"- 00:08:33.255 [2024-11-28 07:33:43.934885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 07:33:43.934912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:43.934951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.255 [2024-11-28 07:33:43.934966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:43.935022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.255 [2024-11-28 07:33:43.935038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.255 #27 NEW cov: 11873 ft: 14913 corp: 15/805b lim: 85 exec/s: 0 rss: 67Mb L: 55/74 MS: 1 ChangeBit- 00:08:33.255 [2024-11-28 07:33:43.975026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 07:33:43.975053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:43.975092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.255 [2024-11-28 07:33:43.975107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:43.975163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.255 [2024-11-28 07:33:43.975179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.255 #28 NEW cov: 11873 ft: 14942 corp: 16/869b lim: 85 exec/s: 0 rss: 68Mb L: 64/74 MS: 1 ChangeByte- 00:08:33.255 [2024-11-28 07:33:44.015252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 07:33:44.015280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:44.015329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.255 [2024-11-28 07:33:44.015344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:44.015396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.255 [2024-11-28 07:33:44.015411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.255 [2024-11-28 07:33:44.015467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.255 [2024-11-28 
07:33:44.015483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.514 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:33.514 #29 NEW cov: 11896 ft: 14954 corp: 17/948b lim: 85 exec/s: 0 rss: 68Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:33.514 [2024-11-28 07:33:44.065392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.514 [2024-11-28 07:33:44.065419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.514 [2024-11-28 07:33:44.065459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.514 [2024-11-28 07:33:44.065471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.514 [2024-11-28 07:33:44.065528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.515 [2024-11-28 07:33:44.065544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.065604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.515 [2024-11-28 07:33:44.065620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.515 #30 NEW cov: 11896 ft: 15014 corp: 18/1022b lim: 85 exec/s: 0 rss: 68Mb L: 74/79 MS: 1 ChangeBinInt- 00:08:33.515 [2024-11-28 07:33:44.105261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.515 [2024-11-28 07:33:44.105288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.105330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.515 [2024-11-28 07:33:44.105345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.515 #31 NEW cov: 11896 ft: 15028 corp: 19/1057b lim: 85 exec/s: 0 rss: 68Mb L: 35/79 MS: 1 InsertByte- 00:08:33.515 [2024-11-28 07:33:44.145541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.515 [2024-11-28 07:33:44.145567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.145609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.515 [2024-11-28 07:33:44.145625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.145680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.515 [2024-11-28 07:33:44.145696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.515 #32 NEW cov: 
11896 ft: 15041 corp: 20/1120b lim: 85 exec/s: 32 rss: 68Mb L: 63/79 MS: 1 ChangeBit- 00:08:33.515 [2024-11-28 07:33:44.185638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.515 [2024-11-28 07:33:44.185667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.185708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.515 [2024-11-28 07:33:44.185724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.185783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.515 [2024-11-28 07:33:44.185799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.515 #33 NEW cov: 11896 ft: 15045 corp: 21/1178b lim: 85 exec/s: 33 rss: 68Mb L: 58/79 MS: 1 CopyPart- 00:08:33.515 [2024-11-28 07:33:44.225611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.515 [2024-11-28 07:33:44.225639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.225678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.515 [2024-11-28 07:33:44.225693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.515 #34 NEW cov: 11896 ft: 15096 corp: 22/1212b lim: 85 exec/s: 34 rss: 68Mb L: 34/79 MS: 1 ChangeBit- 00:08:33.515 [2024-11-28 07:33:44.265742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.515 [2024-11-28 07:33:44.265769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.515 [2024-11-28 07:33:44.265822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.515 [2024-11-28 07:33:44.265837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.774 #35 NEW cov: 11896 ft: 15169 corp: 23/1258b lim: 85 exec/s: 35 rss: 68Mb L: 46/79 MS: 1 EraseBytes- 00:08:33.774 [2024-11-28 07:33:44.306153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.774 [2024-11-28 07:33:44.306180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.774 [2024-11-28 07:33:44.306217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.774 [2024-11-28 07:33:44.306232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.774 [2024-11-28 07:33:44.306287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.774 [2024-11-28 07:33:44.306303] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.774 [2024-11-28 07:33:44.306359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.774 [2024-11-28 07:33:44.306375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.774 #36 NEW cov: 11896 ft: 15188 corp: 24/1332b lim: 85 exec/s: 36 rss: 68Mb L: 74/79 MS: 1 ChangeBinInt- 00:08:33.774 [2024-11-28 07:33:44.346143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.774 [2024-11-28 07:33:44.346171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.774 [2024-11-28 07:33:44.346211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.774 [2024-11-28 07:33:44.346227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.346282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.775 [2024-11-28 07:33:44.346297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.775 #42 NEW cov: 11896 ft: 15227 corp: 25/1396b lim: 85 exec/s: 42 rss: 68Mb L: 64/79 MS: 1 ChangeBinInt- 00:08:33.775 [2024-11-28 07:33:44.386211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.775 [2024-11-28 07:33:44.386241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.386280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.775 [2024-11-28 07:33:44.386295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.386351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.775 [2024-11-28 07:33:44.386366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.775 #43 NEW cov: 11896 ft: 15307 corp: 26/1451b lim: 85 exec/s: 43 rss: 68Mb L: 55/79 MS: 1 CrossOver- 00:08:33.775 [2024-11-28 07:33:44.426517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.775 [2024-11-28 07:33:44.426543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.426582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.775 [2024-11-28 07:33:44.426602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.426656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.775 [2024-11-28 07:33:44.426671] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.426726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:33.775 [2024-11-28 07:33:44.426742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.775 #44 NEW cov: 11896 ft: 15380 corp: 27/1525b lim: 85 exec/s: 44 rss: 68Mb L: 74/79 MS: 1 ChangeByte- 00:08:33.775 [2024-11-28 07:33:44.466443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.775 [2024-11-28 07:33:44.466470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.466509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.775 [2024-11-28 07:33:44.466525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.775 [2024-11-28 07:33:44.466580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:33.775 [2024-11-28 07:33:44.466596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.775 #45 NEW cov: 11896 ft: 15442 corp: 28/1582b lim: 85 exec/s: 45 rss: 68Mb L: 57/79 MS: 1 EraseBytes- 00:08:33.775 [2024-11-28 07:33:44.506257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.775 [2024-11-28 07:33:44.506285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.775 #46 NEW cov: 11896 ft: 15524 corp: 29/1600b lim: 85 exec/s: 46 rss: 68Mb L: 18/79 MS: 1 EraseBytes- 00:08:34.035 [2024-11-28 07:33:44.546707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.546733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.546772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.546788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.546845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.035 [2024-11-28 07:33:44.546860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.035 #47 NEW cov: 11896 ft: 15592 corp: 30/1664b lim: 85 exec/s: 47 rss: 68Mb L: 64/79 MS: 1 ChangeASCIIInt- 00:08:34.035 [2024-11-28 07:33:44.586943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.586970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.587008] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.587024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.587080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.035 [2024-11-28 07:33:44.587095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.587151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:34.035 [2024-11-28 07:33:44.587166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.035 #48 NEW cov: 11896 ft: 15598 corp: 31/1748b lim: 85 exec/s: 48 rss: 68Mb L: 84/84 MS: 1 CrossOver- 00:08:34.035 [2024-11-28 07:33:44.626751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.626777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.626820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.626836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.035 #49 NEW cov: 11896 ft: 15631 corp: 32/1789b lim: 85 exec/s: 49 rss: 68Mb L: 41/84 MS: 1 EraseBytes- 00:08:34.035 [2024-11-28 07:33:44.666879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.666907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.666944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.666959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.035 #50 NEW cov: 11896 ft: 15650 corp: 33/1823b lim: 85 exec/s: 50 rss: 68Mb L: 34/84 MS: 1 ChangeByte- 00:08:34.035 [2024-11-28 07:33:44.706829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.706855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 #51 NEW cov: 11896 ft: 15657 corp: 34/1841b lim: 85 exec/s: 51 rss: 68Mb L: 18/84 MS: 1 ShuffleBytes- 00:08:34.035 [2024-11-28 07:33:44.747141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.747167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.747206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.747222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.035 #52 NEW cov: 11896 ft: 15676 corp: 35/1875b lim: 85 exec/s: 52 rss: 69Mb L: 34/84 MS: 1 CMP- DE: "\001\000\000\000\377\377\377\377"- 00:08:34.035 [2024-11-28 07:33:44.787235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.035 [2024-11-28 07:33:44.787261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.035 [2024-11-28 07:33:44.787294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.035 [2024-11-28 07:33:44.787310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 #53 NEW cov: 11896 ft: 15679 corp: 36/1921b lim: 85 exec/s: 53 rss: 69Mb L: 46/84 MS: 1 ChangeByte- 00:08:34.295 [2024-11-28 07:33:44.827688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.295 [2024-11-28 07:33:44.827715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.827767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.295 [2024-11-28 07:33:44.827781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.827835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.295 [2024-11-28 07:33:44.827851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.827903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:34.295 [2024-11-28 07:33:44.827918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.295 #54 NEW cov: 11896 ft: 15686 corp: 37/1989b lim: 85 exec/s: 54 rss: 69Mb L: 68/84 MS: 1 CMP- DE: "\001\000\000\014"- 00:08:34.295 [2024-11-28 07:33:44.867484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.295 [2024-11-28 07:33:44.867511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.867574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.295 [2024-11-28 07:33:44.867589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 #55 NEW cov: 11896 ft: 15700 corp: 38/2023b lim: 85 exec/s: 55 rss: 69Mb L: 34/84 MS: 1 ShuffleBytes- 00:08:34.295 [2024-11-28 07:33:44.907560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.295 [2024-11-28 07:33:44.907587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.907630] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.295 [2024-11-28 07:33:44.907645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 #56 NEW cov: 11896 ft: 15708 corp: 39/2057b lim: 85 exec/s: 56 rss: 69Mb L: 34/84 MS: 1 CrossOver- 00:08:34.295 [2024-11-28 07:33:44.937793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.295 [2024-11-28 07:33:44.937820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.937864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.295 [2024-11-28 07:33:44.937879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.937935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.295 [2024-11-28 07:33:44.937949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.295 #57 NEW cov: 11896 ft: 15763 corp: 40/2120b lim: 85 exec/s: 57 rss: 69Mb L: 63/84 MS: 1 ChangeBinInt- 00:08:34.295 [2024-11-28 07:33:44.977741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.295 [2024-11-28 07:33:44.977767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.295 [2024-11-28 07:33:44.977808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.295 [2024-11-28 07:33:44.977823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.295 #58 NEW cov: 11896 ft: 15779 corp: 41/2154b lim: 85 exec/s: 58 rss: 69Mb L: 34/84 MS: 1 ChangeBinInt- 00:08:34.296 [2024-11-28 07:33:45.018143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.296 [2024-11-28 07:33:45.018171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.296 [2024-11-28 07:33:45.018215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.296 [2024-11-28 07:33:45.018232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.296 [2024-11-28 07:33:45.018263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.296 [2024-11-28 07:33:45.018278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.296 [2024-11-28 07:33:45.018334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:34.296 [2024-11-28 07:33:45.018350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.296 [2024-11-28 07:33:45.057965] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.296 [2024-11-28 07:33:45.057991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.296 [2024-11-28 07:33:45.058029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.296 [2024-11-28 07:33:45.058045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.554 #60 NEW cov: 11896 ft: 15822 corp: 42/2196b lim: 85 exec/s: 60 rss: 69Mb L: 42/84 MS: 2 ChangeBinInt-EraseBytes- 00:08:34.554 [2024-11-28 07:33:45.098076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.554 [2024-11-28 07:33:45.098102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.554 [2024-11-28 07:33:45.098154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.554 [2024-11-28 07:33:45.098170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.554 #61 NEW cov: 11896 ft: 15857 corp: 43/2242b lim: 85 exec/s: 61 rss: 69Mb L: 46/84 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:34.554 [2024-11-28 07:33:45.138369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.554 [2024-11-28 07:33:45.138396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.554 [2024-11-28 07:33:45.138445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.554 [2024-11-28 07:33:45.138461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.554 [2024-11-28 07:33:45.138516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.554 [2024-11-28 07:33:45.138532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.554 #62 NEW cov: 11896 ft: 15882 corp: 44/2305b lim: 85 exec/s: 31 rss: 69Mb L: 63/84 MS: 1 ChangeByte- 00:08:34.554 #62 DONE cov: 11896 ft: 15882 corp: 44/2305b lim: 85 exec/s: 31 rss: 69Mb 00:08:34.554 ###### Recommended dictionary. ###### 00:08:34.554 "j\352\210\204\366\373\222\000" # Uses: 1 00:08:34.555 "\001\000\000\000\377\377\377\377" # Uses: 0 00:08:34.555 "\001\000\000\014" # Uses: 0 00:08:34.555 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:34.555 ###### End of recommended dictionary. 
###### 00:08:34.555 Done 62 runs in 2 second(s) 00:08:34.555 07:33:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:34.555 07:33:45 -- ../common.sh@72 -- # (( i++ )) 00:08:34.555 07:33:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.555 07:33:45 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:34.555 07:33:45 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:34.555 07:33:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:34.555 07:33:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.555 07:33:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:34.555 07:33:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:34.555 07:33:45 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:34.555 07:33:45 -- nvmf/run.sh@29 -- # port=4423 00:08:34.555 07:33:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:34.555 07:33:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:34.555 07:33:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.555 07:33:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:34.555 [2024-11-28 07:33:45.311951] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:34.555 [2024-11-28 07:33:45.312043] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661865 ] 00:08:34.814 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.814 [2024-11-28 07:33:45.491177] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.814 [2024-11-28 07:33:45.510508] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:34.814 [2024-11-28 07:33:45.510649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.814 [2024-11-28 07:33:45.561871] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.814 [2024-11-28 07:33:45.578222] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:35.073 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.073 INFO: Seed: 923994709 00:08:35.073 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:35.073 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:35.073 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:35.073 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.073 #2 INITED exec/s: 0 rss: 59Mb 00:08:35.073 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:35.073 This may also happen if the target rejected all inputs we tried so far 00:08:35.073 [2024-11-28 07:33:45.622950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.073 [2024-11-28 07:33:45.622985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.073 [2024-11-28 07:33:45.623019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.073 [2024-11-28 07:33:45.623037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.073 [2024-11-28 07:33:45.623067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.073 [2024-11-28 07:33:45.623083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.073 [2024-11-28 07:33:45.623111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.074 [2024-11-28 07:33:45.623127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.074 [2024-11-28 07:33:45.623155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.074 [2024-11-28 07:33:45.623171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.333 NEW_FUNC[1/670]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:35.333 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:35.333 #3 NEW cov: 11596 ft: 11603 corp: 2/26b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:35.333 [2024-11-28 07:33:45.943636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.333 [2024-11-28 07:33:45.943673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.333 [2024-11-28 07:33:45.943722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.333 [2024-11-28 07:33:45.943740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.333 [2024-11-28 07:33:45.943770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.333 [2024-11-28 07:33:45.943785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:45.943813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.334 [2024-11-28 07:33:45.943829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:45.943857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 
nsid:0 00:08:35.334 [2024-11-28 07:33:45.943873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.334 NEW_FUNC[1/1]: 0x12b1fa8 in nvmf_tcp_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3336 00:08:35.334 #4 NEW cov: 11715 ft: 12138 corp: 3/51b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:35.334 [2024-11-28 07:33:46.013738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.334 [2024-11-28 07:33:46.013768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.013815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.334 [2024-11-28 07:33:46.013833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.013867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.334 [2024-11-28 07:33:46.013883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.013911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.334 [2024-11-28 07:33:46.013927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.013955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.334 [2024-11-28 07:33:46.013970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.334 #5 NEW cov: 11721 ft: 12334 corp: 4/76b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CrossOver- 00:08:35.334 [2024-11-28 07:33:46.063878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.334 [2024-11-28 07:33:46.063907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.063954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.334 [2024-11-28 07:33:46.063972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.064002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.334 [2024-11-28 07:33:46.064018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.064045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.334 [2024-11-28 07:33:46.064061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.334 [2024-11-28 07:33:46.064090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 
nsid:0 00:08:35.334 [2024-11-28 07:33:46.064105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.594 #6 NEW cov: 11806 ft: 12661 corp: 5/101b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:35.594 [2024-11-28 07:33:46.134053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.594 [2024-11-28 07:33:46.134082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.134130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.594 [2024-11-28 07:33:46.134147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.134176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.594 [2024-11-28 07:33:46.134192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.134220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.594 [2024-11-28 07:33:46.134236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.134264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.594 [2024-11-28 07:33:46.134279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.594 #12 NEW cov: 11806 ft: 12749 corp: 6/126b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:35.594 [2024-11-28 07:33:46.194204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.594 [2024-11-28 07:33:46.194233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.194281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.594 [2024-11-28 07:33:46.194298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.194327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.594 [2024-11-28 07:33:46.194343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.194371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.594 [2024-11-28 07:33:46.194386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.194414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.594 [2024-11-28 07:33:46.194430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.594 #13 NEW cov: 11806 ft: 12835 corp: 7/151b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:35.594 [2024-11-28 07:33:46.244123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.594 [2024-11-28 07:33:46.244152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.594 #17 NEW cov: 11806 ft: 13607 corp: 8/158b lim: 25 exec/s: 0 rss: 67Mb L: 7/25 MS: 4 CopyPart-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:35.594 [2024-11-28 07:33:46.304468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.594 [2024-11-28 07:33:46.304498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.304544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.594 [2024-11-28 07:33:46.304561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.304591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.594 [2024-11-28 07:33:46.304615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.304643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.594 [2024-11-28 07:33:46.304659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.304687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.594 [2024-11-28 07:33:46.304703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.594 #18 NEW cov: 11806 ft: 13654 corp: 9/183b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CrossOver- 00:08:35.594 [2024-11-28 07:33:46.354592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.594 [2024-11-28 07:33:46.354626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.354673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.594 [2024-11-28 07:33:46.354694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.354724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.594 [2024-11-28 07:33:46.354740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.354767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.594 [2024-11-28 07:33:46.354782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.594 [2024-11-28 07:33:46.354810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.594 [2024-11-28 07:33:46.354826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.854 #19 NEW cov: 11806 ft: 13690 corp: 10/208b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:35.854 [2024-11-28 07:33:46.414622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.854 [2024-11-28 07:33:46.414651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.854 #20 NEW cov: 11806 ft: 13716 corp: 11/215b lim: 25 exec/s: 0 rss: 67Mb L: 7/25 MS: 1 ChangeByte- 00:08:35.854 [2024-11-28 07:33:46.474931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.854 [2024-11-28 07:33:46.474961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.475008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.854 [2024-11-28 07:33:46.475026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.475056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.854 [2024-11-28 07:33:46.475073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.475101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.854 [2024-11-28 07:33:46.475117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.854 #21 NEW cov: 11806 ft: 13765 corp: 12/236b lim: 25 exec/s: 0 rss: 68Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:35.854 [2024-11-28 07:33:46.525060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.854 [2024-11-28 07:33:46.525091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.525137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:35.854 [2024-11-28 07:33:46.525155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.525184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:35.854 [2024-11-28 07:33:46.525200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.525228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:35.854 [2024-11-28 07:33:46.525244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.854 [2024-11-28 07:33:46.525272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:35.854 [2024-11-28 07:33:46.525292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.854 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:35.854 #22 NEW cov: 11823 ft: 13824 corp: 13/261b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 ChangeByte- 00:08:35.854 [2024-11-28 07:33:46.575041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.854 [2024-11-28 07:33:46.575072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 #23 NEW cov: 11823 ft: 13845 corp: 14/269b lim: 25 exec/s: 23 rss: 68Mb L: 8/25 MS: 1 InsertByte- 00:08:36.114 [2024-11-28 07:33:46.645392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.645423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.645469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.114 [2024-11-28 07:33:46.645487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.645516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.114 [2024-11-28 07:33:46.645532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.645560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.114 [2024-11-28 07:33:46.645575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.645611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.114 [2024-11-28 07:33:46.645628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.114 #24 NEW cov: 11823 ft: 13927 corp: 15/294b lim: 25 exec/s: 24 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:36.114 [2024-11-28 07:33:46.695455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.695485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.695533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.114 [2024-11-28 07:33:46.695550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.695579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:36.114 [2024-11-28 07:33:46.695595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.695631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.114 [2024-11-28 07:33:46.695648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.114 #25 NEW cov: 11823 ft: 13960 corp: 16/315b lim: 25 exec/s: 25 rss: 68Mb L: 21/25 MS: 1 CopyPart- 00:08:36.114 [2024-11-28 07:33:46.756127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.756154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.756201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.114 [2024-11-28 07:33:46.756222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.114 #26 NEW cov: 11823 ft: 14255 corp: 17/327b lim: 25 exec/s: 26 rss: 68Mb L: 12/25 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:36.114 [2024-11-28 07:33:46.796138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.796165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 #31 NEW cov: 11823 ft: 14357 corp: 18/332b lim: 25 exec/s: 31 rss: 68Mb L: 5/25 MS: 5 ChangeByte-ShuffleBytes-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:36.114 [2024-11-28 07:33:46.836558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.836586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.836637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.114 [2024-11-28 07:33:46.836653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.836701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.114 [2024-11-28 07:33:46.836715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.836767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.114 [2024-11-28 07:33:46.836780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.114 #32 NEW cov: 11823 ft: 14384 corp: 19/356b lim: 25 exec/s: 32 rss: 68Mb L: 24/25 MS: 1 CrossOver- 00:08:36.114 [2024-11-28 07:33:46.876863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.114 [2024-11-28 07:33:46.876889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.876940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.114 [2024-11-28 07:33:46.876955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.877006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.114 [2024-11-28 07:33:46.877021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.877075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.114 [2024-11-28 07:33:46.877090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.114 [2024-11-28 07:33:46.877143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.114 [2024-11-28 07:33:46.877158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.374 #33 NEW cov: 11823 ft: 14388 corp: 20/381b lim: 25 exec/s: 33 rss: 68Mb L: 25/25 MS: 1 ChangeBit- 00:08:36.374 [2024-11-28 07:33:46.916947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:46.916975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.917040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:46.917055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.917110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.374 [2024-11-28 07:33:46.917126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.917177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.374 [2024-11-28 07:33:46.917192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.917244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.374 [2024-11-28 07:33:46.917259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.374 #34 NEW cov: 11823 ft: 14395 corp: 21/406b lim: 25 exec/s: 34 rss: 68Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:36.374 [2024-11-28 07:33:46.957068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:46.957095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.957145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:46.957160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.957210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.374 [2024-11-28 07:33:46.957225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.957278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.374 [2024-11-28 07:33:46.957292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.957344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.374 [2024-11-28 07:33:46.957359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.374 #35 NEW cov: 11823 ft: 14425 corp: 22/431b lim: 25 exec/s: 35 rss: 68Mb L: 25/25 MS: 1 ChangeBit- 00:08:36.374 [2024-11-28 07:33:46.996846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:46.996872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:46.996913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:46.996927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 #36 NEW cov: 11823 ft: 14448 corp: 23/441b lim: 25 exec/s: 36 rss: 68Mb L: 10/25 MS: 1 CopyPart- 00:08:36.374 [2024-11-28 07:33:47.037280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:47.037306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.037362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:47.037375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.037430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.374 [2024-11-28 07:33:47.037448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.037504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.374 [2024-11-28 07:33:47.037519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.037574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.374 [2024-11-28 07:33:47.037590] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.374 #37 NEW cov: 11823 ft: 14478 corp: 24/466b lim: 25 exec/s: 37 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:08:36.374 [2024-11-28 07:33:47.077409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:47.077437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.077491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:47.077507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.077562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.374 [2024-11-28 07:33:47.077577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.077631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.374 [2024-11-28 07:33:47.077646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.077697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.374 [2024-11-28 07:33:47.077711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.374 #38 NEW cov: 11823 ft: 14557 corp: 25/491b lim: 25 exec/s: 38 rss: 68Mb L: 25/25 MS: 1 CMP- DE: "\270\223\312]\370\373\222\000"- 00:08:36.374 [2024-11-28 07:33:47.117418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.374 [2024-11-28 07:33:47.117444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.117491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.374 [2024-11-28 07:33:47.117507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.117560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.374 [2024-11-28 07:33:47.117576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.374 [2024-11-28 07:33:47.117631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.374 [2024-11-28 07:33:47.117645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.634 #39 NEW cov: 11823 ft: 14656 corp: 26/515b lim: 25 exec/s: 39 rss: 68Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:36.634 [2024-11-28 07:33:47.157319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.157345] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.157391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.634 [2024-11-28 07:33:47.157410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.634 #40 NEW cov: 11823 ft: 14664 corp: 27/527b lim: 25 exec/s: 40 rss: 68Mb L: 12/25 MS: 1 CrossOver- 00:08:36.634 [2024-11-28 07:33:47.197789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.197816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.197865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.634 [2024-11-28 07:33:47.197881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.197934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.634 [2024-11-28 07:33:47.197950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.198004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.634 [2024-11-28 07:33:47.198018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.198074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.634 [2024-11-28 07:33:47.198089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.634 #41 NEW cov: 11823 ft: 14669 corp: 28/552b lim: 25 exec/s: 41 rss: 68Mb L: 25/25 MS: 1 ChangeBit- 00:08:36.634 [2024-11-28 07:33:47.237737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.237765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.237812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.634 [2024-11-28 07:33:47.237826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.237877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.634 [2024-11-28 07:33:47.237892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.237945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.634 [2024-11-28 07:33:47.237959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:36.634 #42 NEW cov: 11823 ft: 14697 corp: 29/576b lim: 25 exec/s: 42 rss: 68Mb L: 24/25 MS: 1 ChangeByte- 00:08:36.634 [2024-11-28 07:33:47.277534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.277561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 #43 NEW cov: 11823 ft: 14708 corp: 30/581b lim: 25 exec/s: 43 rss: 69Mb L: 5/25 MS: 1 ChangeByte- 00:08:36.634 [2024-11-28 07:33:47.318058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.318085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.318138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.634 [2024-11-28 07:33:47.318154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.318210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.634 [2024-11-28 07:33:47.318225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.318279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.634 [2024-11-28 07:33:47.318294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.318346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.634 [2024-11-28 07:33:47.318360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.634 #44 NEW cov: 11823 ft: 14711 corp: 31/606b lim: 25 exec/s: 44 rss: 69Mb L: 25/25 MS: 1 ChangeBit- 00:08:36.634 [2024-11-28 07:33:47.358060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.634 [2024-11-28 07:33:47.358087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.358135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.634 [2024-11-28 07:33:47.358150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.358204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.634 [2024-11-28 07:33:47.358218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.634 [2024-11-28 07:33:47.358270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.634 [2024-11-28 07:33:47.358285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.634 #45 NEW 
cov: 11823 ft: 14723 corp: 32/630b lim: 25 exec/s: 45 rss: 69Mb L: 24/25 MS: 1 CMP- DE: "\001\010"- 00:08:36.635 [2024-11-28 07:33:47.398335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.635 [2024-11-28 07:33:47.398361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.635 [2024-11-28 07:33:47.398417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.635 [2024-11-28 07:33:47.398431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.635 [2024-11-28 07:33:47.398485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.635 [2024-11-28 07:33:47.398500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.635 [2024-11-28 07:33:47.398578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.635 [2024-11-28 07:33:47.398594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.635 [2024-11-28 07:33:47.398653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.635 [2024-11-28 07:33:47.398668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.894 #46 NEW cov: 11823 ft: 14775 corp: 33/655b lim: 25 exec/s: 46 rss: 69Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:36.894 [2024-11-28 07:33:47.438472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.894 [2024-11-28 07:33:47.438499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.894 [2024-11-28 07:33:47.438551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.894 [2024-11-28 07:33:47.438567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.894 [2024-11-28 07:33:47.438622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.894 [2024-11-28 07:33:47.438637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.438689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.895 [2024-11-28 07:33:47.438704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.438755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.895 [2024-11-28 07:33:47.438771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.895 #47 NEW cov: 11823 ft: 14783 corp: 34/680b lim: 25 exec/s: 47 rss: 69Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:36.895 [2024-11-28 
07:33:47.478545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.895 [2024-11-28 07:33:47.478572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.478648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.895 [2024-11-28 07:33:47.478662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.478714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.895 [2024-11-28 07:33:47.478729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.478781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.895 [2024-11-28 07:33:47.478796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.478851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:36.895 [2024-11-28 07:33:47.478866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.895 #48 NEW cov: 11823 ft: 14796 corp: 35/705b lim: 25 exec/s: 48 rss: 69Mb L: 25/25 MS: 1 CopyPart- 00:08:36.895 [2024-11-28 07:33:47.518539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.895 [2024-11-28 07:33:47.518565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.518632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.895 [2024-11-28 07:33:47.518648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.518698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.895 [2024-11-28 07:33:47.518713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.518765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.895 [2024-11-28 07:33:47.518780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.895 #49 NEW cov: 11830 ft: 14819 corp: 36/729b lim: 25 exec/s: 49 rss: 69Mb L: 24/25 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:08:36.895 [2024-11-28 07:33:47.558805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.895 [2024-11-28 07:33:47.558833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.895 [2024-11-28 07:33:47.558887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:1 nsid:0
00:08:36.895 [2024-11-28 07:33:47.558903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.558956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:36.895 [2024-11-28 07:33:47.558971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.559024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:36.895 [2024-11-28 07:33:47.559039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.559092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:36.895 [2024-11-28 07:33:47.559106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:36.895 #50 NEW cov: 11830 ft: 14841 corp: 37/754b lim: 25 exec/s: 50 rss: 69Mb L: 25/25 MS: 1 CrossOver-
00:08:36.895 [2024-11-28 07:33:47.598901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:36.895 [2024-11-28 07:33:47.598931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.598968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:36.895 [2024-11-28 07:33:47.598984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.599039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:36.895 [2024-11-28 07:33:47.599055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:36.895 [2024-11-28 07:33:47.599110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:36.895 [2024-11-28 07:33:47.599125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:36.895 #51 NEW cov: 11830 ft: 14857 corp: 38/775b lim: 25 exec/s: 25 rss: 69Mb L: 21/25 MS: 1 PersAutoDict- DE: "\000\004\000\000\000\000\000\000"-
00:08:36.895 #51 DONE cov: 11830 ft: 14857 corp: 38/775b lim: 25 exec/s: 25 rss: 69Mb
00:08:36.895 ###### Recommended dictionary. ######
00:08:36.895 "\001\000\000\000" # Uses: 0
00:08:36.895 "\270\223\312]\370\373\222\000" # Uses: 0
00:08:36.895 "\001\010" # Uses: 0
00:08:36.895 "\000\004\000\000\000\000\000\000" # Uses: 1
00:08:36.895 ###### End of recommended dictionary. ######
00:08:36.895 Done 51 runs in 2 second(s)
00:08:37.155 07:33:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:08:37.155 07:33:47 -- ../common.sh@72 -- # (( i++ ))
00:08:37.155 07:33:47 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:37.155 07:33:47 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:08:37.155 07:33:47 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:08:37.155 07:33:47 -- nvmf/run.sh@24 -- # local timen=1
00:08:37.155 07:33:47 -- nvmf/run.sh@25 -- # local core=0x1
00:08:37.155 07:33:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:37.155 07:33:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:08:37.155 07:33:47 -- nvmf/run.sh@29 -- # printf %02d 24
00:08:37.155 07:33:47 -- nvmf/run.sh@29 -- # port=4424
00:08:37.155 07:33:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:37.155 07:33:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:08:37.155 07:33:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:37.155 07:33:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:08:37.415 [2024-11-28 07:33:47.774165] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:37.415 [2024-11-28 07:33:47.774259] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1661919 ]
00:08:37.415 EAL: No free 2048 kB hugepages reported on node 1
00:08:37.414 [2024-11-28 07:33:47.948705] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:37.414 [2024-11-28 07:33:47.968618] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:37.414 [2024-11-28 07:33:47.968756] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:37.415 [2024-11-28 07:33:48.020176] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:37.415 [2024-11-28 07:33:48.036533] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:08:37.415 INFO: Running with entropic power schedule (0xFF, 100).
00:08:37.415 INFO: Seed: 3380996955
00:08:37.415 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:37.415 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:37.415 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:37.415 INFO: A corpus is not provided, starting from an empty corpus
00:08:37.415 #2 INITED exec/s: 0 rss: 59Mb
00:08:37.415 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:37.415 This may also happen if the target rejected all inputs we tried so far
00:08:37.415 [2024-11-28 07:33:48.081774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.415 [2024-11-28 07:33:48.081805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:37.415 [2024-11-28 07:33:48.081857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.415 [2024-11-28 07:33:48.081872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:37.415 [2024-11-28 07:33:48.081924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.415 [2024-11-28 07:33:48.081940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:37.674 NEW_FUNC[1/672]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685
00:08:37.674 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:37.674 #6 NEW cov: 11673 ft: 11674 corp: 2/69b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 4 CopyPart-ChangeBit-InsertByte-InsertRepeatedBytes-
00:08:37.674 [2024-11-28 07:33:48.392657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.674 [2024-11-28 07:33:48.392690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:37.674 [2024-11-28 07:33:48.392746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.674 [2024-11-28 07:33:48.392762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:37.674 [2024-11-28 07:33:48.392816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.674 [2024-11-28 07:33:48.392830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:37.674 #7 NEW cov: 11787 ft: 12107 corp: 3/137b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBinInt-
00:08:37.674 [2024-11-28 07:33:48.442533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.674 [2024-11-28 07:33:48.442560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:37.674 [2024-11-28 07:33:48.442616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1229782938247303441 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:37.674 [2024-11-28 07:33:48.442632] nvme_qpair.c:
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.934 #12 NEW cov: 11793 ft: 12685 corp: 4/187b lim: 100 exec/s: 0 rss: 67Mb L: 50/68 MS: 5 ShuffleBytes-ChangeBit-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:37.934 [2024-11-28 07:33:48.482808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.482835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.934 [2024-11-28 07:33:48.482873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.482889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.934 [2024-11-28 07:33:48.482943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.482958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.934 #13 NEW cov: 11878 ft: 12914 corp: 5/255b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 CopyPart- 00:08:37.934 [2024-11-28 07:33:48.522880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.522909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.934 [2024-11-28 07:33:48.522948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.522963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.934 [2024-11-28 07:33:48.523017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.523032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.934 #14 NEW cov: 11878 ft: 12965 corp: 6/323b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 CMP- DE: "\005\000\000\000\000\000\000\000"- 00:08:37.934 [2024-11-28 07:33:48.562998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.563025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.934 [2024-11-28 07:33:48.563066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-28 07:33:48.563081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:37.934 [2024-11-28 07:33:48.563133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1229782938247303441 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.563148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.935 #15 NEW cov: 11878 ft: 13179 corp: 7/384b lim: 100 exec/s: 0 rss: 67Mb L: 61/68 MS: 1 InsertRepeatedBytes- 00:08:37.935 [2024-11-28 07:33:48.603114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.603142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.935 [2024-11-28 07:33:48.603195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.603211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.935 [2024-11-28 07:33:48.603263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.603280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.935 #16 NEW cov: 11878 ft: 13242 corp: 8/452b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBit- 00:08:37.935 [2024-11-28 07:33:48.643079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.643106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.935 [2024-11-28 07:33:48.643148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.643163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.935 #17 NEW cov: 11878 ft: 13294 corp: 9/497b lim: 100 exec/s: 0 rss: 67Mb L: 45/68 MS: 1 EraseBytes- 00:08:37.935 [2024-11-28 07:33:48.683368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057633131331583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.683394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.935 [2024-11-28 07:33:48.683434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 [2024-11-28 07:33:48.683451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.935 [2024-11-28 07:33:48.683504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.935 
[2024-11-28 07:33:48.683519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.935 #18 NEW cov: 11878 ft: 13325 corp: 10/565b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBinInt- 00:08:38.194 [2024-11-28 07:33:48.723444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-28 07:33:48.723472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.194 [2024-11-28 07:33:48.723510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-28 07:33:48.723525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.194 [2024-11-28 07:33:48.723579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-28 07:33:48.723594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.194 #19 NEW cov: 11878 ft: 13362 corp: 11/633b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBinInt- 00:08:38.195 [2024-11-28 07:33:48.763561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057633131331583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.763588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.763633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.763649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.763717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.763734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.195 #20 NEW cov: 11878 ft: 13421 corp: 12/701b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBit- 00:08:38.195 [2024-11-28 07:33:48.803700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057633131331583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.803728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.803765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.803780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.803832] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.803847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.195 #21 NEW cov: 11878 ft: 13461 corp: 13/769b lim: 100 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 ChangeBit- 00:08:38.195 [2024-11-28 07:33:48.843691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.843718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.843764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.843784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.195 #22 NEW cov: 11878 ft: 13486 corp: 14/823b lim: 100 exec/s: 0 rss: 68Mb L: 54/68 MS: 1 EraseBytes- 00:08:38.195 [2024-11-28 07:33:48.883927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.883954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.883993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.884009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.884064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:83886080 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.884080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.195 #23 NEW cov: 11878 ft: 13509 corp: 15/884b lim: 100 exec/s: 0 rss: 68Mb L: 61/68 MS: 1 PersAutoDict- DE: "\005\000\000\000\000\000\000\000"- 00:08:38.195 [2024-11-28 07:33:48.924007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.924035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.924073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2738188573441261338 len:65528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.924089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.924146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 
07:33:48.924161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.195 #24 NEW cov: 11878 ft: 13527 corp: 16/944b lim: 100 exec/s: 0 rss: 68Mb L: 60/68 MS: 1 CrossOver- 00:08:38.195 [2024-11-28 07:33:48.964155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.964182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.195 [2024-11-28 07:33:48.964221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.195 [2024-11-28 07:33:48.964237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:48.964292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:48.964309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.454 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:38.454 #25 NEW cov: 11901 ft: 13584 corp: 17/1012b lim: 100 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 CMP- DE: "\001\000"- 00:08:38.454 [2024-11-28 07:33:49.004113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.004143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.004196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.004212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.454 #26 NEW cov: 11901 ft: 13642 corp: 18/1067b lim: 100 exec/s: 0 rss: 68Mb L: 55/68 MS: 1 CrossOver- 00:08:38.454 [2024-11-28 07:33:49.044413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.044440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.044489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743747292037119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.044505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.044561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.044577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.454 #27 NEW cov: 11901 ft: 13710 corp: 19/1135b lim: 100 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 ChangeByte- 00:08:38.454 [2024-11-28 07:33:49.084505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17870283317549858815 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.084533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.084587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.084609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.084664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.084680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.454 #28 NEW cov: 11901 ft: 13723 corp: 20/1203b lim: 100 exec/s: 28 rss: 68Mb L: 68/68 MS: 1 ChangeBit- 00:08:38.454 [2024-11-28 07:33:49.124611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.124638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.124683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.124698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.124751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:18 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.124766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.454 #29 NEW cov: 11901 ft: 13745 corp: 21/1278b lim: 100 exec/s: 29 rss: 68Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:38.454 [2024-11-28 07:33:49.164716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.164747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.454 [2024-11-28 07:33:49.164809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.454 [2024-11-28 07:33:49.164825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.455 [2024-11-28 07:33:49.164876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251629368031159 len:4370 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.455 [2024-11-28 07:33:49.164891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.455 #30 NEW cov: 11901 ft: 13776 corp: 22/1341b lim: 100 exec/s: 30 rss: 68Mb L: 63/75 MS: 1 InsertRepeatedBytes- 00:08:38.455 [2024-11-28 07:33:49.204844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.455 [2024-11-28 07:33:49.204871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.455 [2024-11-28 07:33:49.204910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803460719580180 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.455 [2024-11-28 07:33:49.204925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.455 [2024-11-28 07:33:49.204979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446804466078848020 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.455 [2024-11-28 07:33:49.204994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.713 #31 NEW cov: 11901 ft: 13819 corp: 23/1420b lim: 100 exec/s: 31 rss: 68Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:38.713 [2024-11-28 07:33:49.244973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.245000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.245038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.245054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.245108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:360287970206416895 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.245123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 #32 NEW cov: 11901 ft: 13827 corp: 24/1496b lim: 100 exec/s: 32 rss: 68Mb L: 76/79 MS: 1 PersAutoDict- DE: "\005\000\000\000\000\000\000\000"- 00:08:38.714 [2024-11-28 07:33:49.285065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.285093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.285149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.285166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.285222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.285237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 #33 NEW cov: 11901 ft: 13836 corp: 25/1564b lim: 100 exec/s: 33 rss: 68Mb L: 68/79 MS: 1 ChangeByte- 00:08:38.714 [2024-11-28 07:33:49.315162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.315191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.315243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.315258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.315310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.315324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 #34 NEW cov: 11901 ft: 13901 corp: 26/1632b lim: 100 exec/s: 34 rss: 68Mb L: 68/79 MS: 1 PersAutoDict- DE: "\005\000\000\000\000\000\000\000"- 00:08:38.714 [2024-11-28 07:33:49.355441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.355469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.355517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2738188573441261338 len:65528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.355533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.355586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.355603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.355659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.355674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.714 #35 NEW cov: 11901 ft: 14273 corp: 27/1731b lim: 100 exec/s: 35 rss: 68Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:38.714 [2024-11-28 07:33:49.395424] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.395452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.395491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1229782941043242935 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.395506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.395560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1229782941032321297 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.395578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 #36 NEW cov: 11901 ft: 14294 corp: 28/1792b lim: 100 exec/s: 36 rss: 68Mb L: 61/99 MS: 1 CopyPart- 00:08:38.714 [2024-11-28 07:33:49.435551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.435579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.435625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.435643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.435698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.435714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.714 #37 NEW cov: 11901 ft: 14308 corp: 29/1860b lim: 100 exec/s: 37 rss: 68Mb L: 68/99 MS: 1 ShuffleBytes- 00:08:38.714 [2024-11-28 07:33:49.475678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.475705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.475761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.475777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.714 [2024-11-28 07:33:49.475831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:24832 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.714 [2024-11-28 07:33:49.475846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 #38 NEW cov: 11901 ft: 
14324 corp: 30/1929b lim: 100 exec/s: 38 rss: 68Mb L: 69/99 MS: 1 InsertByte- 00:08:38.974 [2024-11-28 07:33:49.515780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.515807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.515844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.515859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.515914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.515929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 #39 NEW cov: 11901 ft: 14354 corp: 31/1997b lim: 100 exec/s: 39 rss: 68Mb L: 68/99 MS: 1 ShuffleBytes- 00:08:38.974 [2024-11-28 07:33:49.545961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.545987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.546066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803460719580180 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.546083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.546137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18380338055674598420 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.546151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.546206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.546222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.974 #40 NEW cov: 11901 ft: 14360 corp: 32/2077b lim: 100 exec/s: 40 rss: 68Mb L: 80/99 MS: 1 InsertByte- 00:08:38.974 [2024-11-28 07:33:49.586012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.586038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.586090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.586106] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.586159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446462598732840959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.586174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 #41 NEW cov: 11901 ft: 14367 corp: 33/2145b lim: 100 exec/s: 41 rss: 68Mb L: 68/99 MS: 1 ChangeBit- 00:08:38.974 [2024-11-28 07:33:49.626123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.626150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.626188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:5906 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.626205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.626259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.626273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 #42 NEW cov: 11901 ft: 14394 corp: 34/2208b lim: 100 exec/s: 42 rss: 68Mb L: 63/99 MS: 1 ChangeBinInt- 00:08:38.974 [2024-11-28 07:33:49.666085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.666112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.666166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.666182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 #43 NEW cov: 11901 ft: 14408 corp: 35/2253b lim: 100 exec/s: 43 rss: 69Mb L: 45/99 MS: 1 ChangeBit- 00:08:38.974 [2024-11-28 07:33:49.706272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:452919552 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.706298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.706346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743747292037119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.706361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.974 [2024-11-28 07:33:49.706413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.974 [2024-11-28 07:33:49.706428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.974 #44 NEW cov: 11901 ft: 14427 corp: 36/2321b lim: 100 exec/s: 44 rss: 69Mb L: 68/99 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:39.233 [2024-11-28 07:33:49.746307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.746335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.746371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.746386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 #45 NEW cov: 11901 ft: 14460 corp: 37/2375b lim: 100 exec/s: 45 rss: 69Mb L: 54/99 MS: 1 CMP- DE: "\363\000"- 00:08:39.233 [2024-11-28 07:33:49.786569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.786596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.786659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.786675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.786727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.786743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.233 #46 NEW cov: 11901 ft: 14519 corp: 38/2443b lim: 100 exec/s: 46 rss: 69Mb L: 68/99 MS: 1 ChangeBinInt- 00:08:39.233 [2024-11-28 07:33:49.826704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069853282303 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.826732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.826781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1446803460719580180 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.826797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.826849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1446804466078848020 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.826867] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.233 #47 NEW cov: 11901 ft: 14556 corp: 39/2522b lim: 100 exec/s: 47 rss: 69Mb L: 79/99 MS: 1 ShuffleBytes- 00:08:39.233 [2024-11-28 07:33:49.866608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.866635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.866685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1229782941043242935 len:47032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.866699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 #48 NEW cov: 11901 ft: 14634 corp: 40/2577b lim: 100 exec/s: 48 rss: 69Mb L: 55/99 MS: 1 EraseBytes- 00:08:39.233 [2024-11-28 07:33:49.906960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.906988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.907025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.907040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.907092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1229782938247303441 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.907108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.233 #49 NEW cov: 11901 ft: 14645 corp: 41/2638b lim: 100 exec/s: 49 rss: 69Mb L: 61/99 MS: 1 ShuffleBytes- 00:08:39.233 [2024-11-28 07:33:49.947032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1229782938134188305 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.947059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.947101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.947116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.233 [2024-11-28 07:33:49.947168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1229782938247303441 len:4370 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.947183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.233 #50 NEW cov: 11901 ft: 14661 corp: 42/2707b lim: 100 exec/s: 50 rss: 70Mb L: 
69/99 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:39.233 [2024-11-28 07:33:49.987072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057633131331583 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.233 [2024-11-28 07:33:49.987100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.234 [2024-11-28 07:33:49.987163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.234 [2024-11-28 07:33:49.987180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.234 [2024-11-28 07:33:49.987236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.234 [2024-11-28 07:33:49.987251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.493 #56 NEW cov: 11901 ft: 14750 corp: 43/2776b lim: 100 exec/s: 56 rss: 70Mb L: 69/99 MS: 1 InsertByte- 00:08:39.493 [2024-11-28 07:33:50.027442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.027472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.493 [2024-11-28 07:33:50.027511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.027528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.493 [2024-11-28 07:33:50.027579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.027594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.493 [2024-11-28 07:33:50.027654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:281474959867904 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.027669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.493 #57 NEW cov: 11901 ft: 14757 corp: 44/2866b lim: 100 exec/s: 57 rss: 70Mb L: 90/99 MS: 1 InsertRepeatedBytes- 00:08:39.493 [2024-11-28 07:33:50.077449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.077478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.493 [2024-11-28 07:33:50.077516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.077532] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.493 [2024-11-28 07:33:50.077585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.493 [2024-11-28 07:33:50.077604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.493 #58 NEW cov: 11901 ft: 14769 corp: 45/2934b lim: 100 exec/s: 29 rss: 70Mb L: 68/99 MS: 1 CopyPart- 00:08:39.493 #58 DONE cov: 11901 ft: 14769 corp: 45/2934b lim: 100 exec/s: 29 rss: 70Mb 00:08:39.493 ###### Recommended dictionary. ###### 00:08:39.493 "\005\000\000\000\000\000\000\000" # Uses: 3 00:08:39.493 "\001\000" # Uses: 1 00:08:39.493 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:39.493 "\363\000" # Uses: 2 00:08:39.493 ###### End of recommended dictionary. ###### 00:08:39.493 Done 58 runs in 2 second(s) 00:08:39.493 07:33:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:39.493 07:33:50 -- ../common.sh@72 -- # (( i++ )) 00:08:39.493 07:33:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.493 07:33:50 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:39.493 00:08:39.493 real 1m2.385s 00:08:39.493 user 1m38.842s 00:08:39.493 sys 0m7.229s 00:08:39.493 07:33:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:39.493 07:33:50 -- common/autotest_common.sh@10 -- # set +x 00:08:39.493 ************************************ 00:08:39.493 END TEST nvmf_fuzz 00:08:39.493 ************************************ 00:08:39.493 07:33:50 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:39.493 07:33:50 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:39.493 07:33:50 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:39.493 07:33:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:39.493 07:33:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:39.493 07:33:50 -- common/autotest_common.sh@10 -- # set +x 00:08:39.493 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 1106: kill: (1594071) - No such process 00:08:39.493 ************************************ 00:08:39.493 START TEST vfio_fuzz 00:08:39.493 ************************************ 00:08:39.493 07:33:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:39.754 * Looking for test storage... 
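The nvmf_fuzz run above closes with libFuzzer's standard status lines (#N NEW/DONE followed by cov, ft, corp, exec/s and rss fields), so the throughput and memory trend of a run like this can be recovered from the saved console log with nothing beyond grep and awk. A minimal sketch, assuming the log was captured as build.log (a placeholder name; this extraction is not part of the SPDK harness itself):

# Pull iteration number, exec/s and rss out of libFuzzer status lines such as
# "#58 DONE cov: 11901 ft: 14769 corp: 45/2934b lim: 100 exec/s: 29 rss: 70Mb".
# 'build.log' is a hypothetical capture of this console output.
grep -Eo '#[0-9]+ (NEW|REDUCE|DONE).*' build.log |
  awk '{
    for (i = 1; i <= NF; i++) {
      if ($i == "exec/s:") es  = $(i + 1)   # iterations per second
      if ($i == "rss:")    rss = $(i + 1)   # resident set size, e.g. 70Mb
    }
    print $1, "exec/s=" es, "rss=" rss
  }'

On the lines above this would print entries like "#58 exec/s=29 rss=70Mb", making it easy to spot runs whose corpus growth stalls or whose memory climbs toward the job limit.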
00:08:39.754 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:39.754 07:33:50 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:39.754 07:33:50 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:39.754 07:33:50 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:39.754 07:33:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:39.754 07:33:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:39.754 07:33:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:39.754 07:33:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:39.754 07:33:50 -- scripts/common.sh@335 -- # IFS=.-: 00:08:39.754 07:33:50 -- scripts/common.sh@335 -- # read -ra ver1 00:08:39.754 07:33:50 -- scripts/common.sh@336 -- # IFS=.-: 00:08:39.754 07:33:50 -- scripts/common.sh@336 -- # read -ra ver2 00:08:39.754 07:33:50 -- scripts/common.sh@337 -- # local 'op=<' 00:08:39.754 07:33:50 -- scripts/common.sh@339 -- # ver1_l=2 00:08:39.754 07:33:50 -- scripts/common.sh@340 -- # ver2_l=1 00:08:39.754 07:33:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:39.754 07:33:50 -- scripts/common.sh@343 -- # case "$op" in 00:08:39.754 07:33:50 -- scripts/common.sh@344 -- # : 1 00:08:39.754 07:33:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:39.754 07:33:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:39.754 07:33:50 -- scripts/common.sh@364 -- # decimal 1 00:08:39.754 07:33:50 -- scripts/common.sh@352 -- # local d=1 00:08:39.754 07:33:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:39.754 07:33:50 -- scripts/common.sh@354 -- # echo 1 00:08:39.754 07:33:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:39.754 07:33:50 -- scripts/common.sh@365 -- # decimal 2 00:08:39.754 07:33:50 -- scripts/common.sh@352 -- # local d=2 00:08:39.754 07:33:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:39.754 07:33:50 -- scripts/common.sh@354 -- # echo 2 00:08:39.754 07:33:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:39.754 07:33:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:39.754 07:33:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:39.754 07:33:50 -- scripts/common.sh@367 -- # return 0 00:08:39.754 07:33:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:39.754 07:33:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:39.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:39.754 --rc genhtml_branch_coverage=1 00:08:39.754 --rc genhtml_function_coverage=1 00:08:39.754 --rc genhtml_legend=1 00:08:39.754 --rc geninfo_all_blocks=1 00:08:39.754 --rc geninfo_unexecuted_blocks=1 00:08:39.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:39.754 ' 00:08:39.754 07:33:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:39.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:39.754 --rc genhtml_branch_coverage=1 00:08:39.754 --rc genhtml_function_coverage=1 00:08:39.754 --rc genhtml_legend=1 00:08:39.754 --rc geninfo_all_blocks=1 00:08:39.754 --rc geninfo_unexecuted_blocks=1 00:08:39.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:39.754 ' 00:08:39.754 07:33:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:39.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:39.754 --rc genhtml_branch_coverage=1 
00:08:39.754 --rc genhtml_function_coverage=1 00:08:39.754 --rc genhtml_legend=1 00:08:39.754 --rc geninfo_all_blocks=1 00:08:39.754 --rc geninfo_unexecuted_blocks=1 00:08:39.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:39.754 ' 00:08:39.754 07:33:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:39.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:39.754 --rc genhtml_branch_coverage=1 00:08:39.754 --rc genhtml_function_coverage=1 00:08:39.754 --rc genhtml_legend=1 00:08:39.754 --rc geninfo_all_blocks=1 00:08:39.754 --rc geninfo_unexecuted_blocks=1 00:08:39.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:39.754 ' 00:08:39.754 07:33:50 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:39.754 07:33:50 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:39.754 07:33:50 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:39.754 07:33:50 -- common/autotest_common.sh@34 -- # set -e 00:08:39.754 07:33:50 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:39.754 07:33:50 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:39.754 07:33:50 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:39.754 07:33:50 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:39.754 07:33:50 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:39.754 07:33:50 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:39.754 07:33:50 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:39.754 07:33:50 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:39.754 07:33:50 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:39.754 07:33:50 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:39.754 07:33:50 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:39.754 07:33:50 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:39.754 07:33:50 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:39.754 07:33:50 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:39.754 07:33:50 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:39.754 07:33:50 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:39.754 07:33:50 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:39.754 07:33:50 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:39.754 07:33:50 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:39.754 07:33:50 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:39.754 07:33:50 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:39.754 07:33:50 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:39.754 07:33:50 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:39.754 07:33:50 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:39.754 07:33:50 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:39.754 07:33:50 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:39.754 07:33:50 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:39.754 07:33:50 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:39.754 07:33:50 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:39.754 
07:33:50 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:39.754 07:33:50 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:39.754 07:33:50 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:39.754 07:33:50 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:39.754 07:33:50 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:39.754 07:33:50 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:39.754 07:33:50 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:39.754 07:33:50 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:39.754 07:33:50 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:39.754 07:33:50 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:39.755 07:33:50 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:39.755 07:33:50 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:39.755 07:33:50 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:39.755 07:33:50 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:39.755 07:33:50 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:39.755 07:33:50 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:39.755 07:33:50 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:39.755 07:33:50 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:39.755 07:33:50 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:39.755 07:33:50 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:39.755 07:33:50 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:39.755 07:33:50 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:39.755 07:33:50 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:39.755 07:33:50 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:39.755 07:33:50 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:39.755 07:33:50 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:39.755 07:33:50 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:39.755 07:33:50 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:39.755 07:33:50 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:39.755 07:33:50 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:39.755 07:33:50 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:39.755 07:33:50 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:39.755 07:33:50 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:39.755 07:33:50 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:39.755 07:33:50 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:39.755 07:33:50 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:39.755 07:33:50 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:39.755 07:33:50 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:39.755 07:33:50 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:39.755 07:33:50 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:39.755 07:33:50 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:39.755 07:33:50 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:39.755 07:33:50 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:39.755 07:33:50 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:39.755 07:33:50 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:39.755 07:33:50 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:39.755 07:33:50 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:39.755 07:33:50 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:39.755 07:33:50 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:39.755 07:33:50 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:39.755 07:33:50 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:39.755 07:33:50 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:39.755 07:33:50 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:39.755 07:33:50 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:39.755 07:33:50 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:39.755 07:33:50 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:39.755 07:33:50 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:39.755 07:33:50 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:39.755 07:33:50 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:39.755 07:33:50 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:39.755 07:33:50 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:39.755 07:33:50 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:39.755 07:33:50 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:39.755 07:33:50 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:39.755 07:33:50 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:39.755 07:33:50 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:39.755 07:33:50 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:39.755 07:33:50 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:39.755 07:33:50 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:39.755 07:33:50 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:39.755 #define SPDK_CONFIG_H 00:08:39.755 #define SPDK_CONFIG_APPS 1 00:08:39.755 #define SPDK_CONFIG_ARCH native 00:08:39.755 #undef SPDK_CONFIG_ASAN 00:08:39.755 #undef SPDK_CONFIG_AVAHI 00:08:39.755 #undef SPDK_CONFIG_CET 00:08:39.755 #define SPDK_CONFIG_COVERAGE 1 00:08:39.755 #define SPDK_CONFIG_CROSS_PREFIX 00:08:39.755 #undef SPDK_CONFIG_CRYPTO 00:08:39.755 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:39.755 #undef SPDK_CONFIG_CUSTOMOCF 00:08:39.755 #undef SPDK_CONFIG_DAOS 00:08:39.755 #define SPDK_CONFIG_DAOS_DIR 00:08:39.755 #define SPDK_CONFIG_DEBUG 1 00:08:39.755 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:39.755 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:39.755 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:39.755 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:39.755 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:39.755 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:39.755 #define SPDK_CONFIG_EXAMPLES 1 00:08:39.755 #undef SPDK_CONFIG_FC 00:08:39.755 #define SPDK_CONFIG_FC_PATH 00:08:39.755 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:39.755 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:39.755 #undef SPDK_CONFIG_FUSE 00:08:39.755 #define SPDK_CONFIG_FUZZER 1 00:08:39.755 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:39.755 #undef SPDK_CONFIG_GOLANG 00:08:39.755 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:39.755 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:39.755 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:39.755 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:39.755 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:39.755 #define SPDK_CONFIG_IDXD 1 00:08:39.755 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:39.755 #undef SPDK_CONFIG_IPSEC_MB 00:08:39.755 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:39.755 #define SPDK_CONFIG_ISAL 1 00:08:39.755 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:39.755 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:39.755 #define SPDK_CONFIG_LIBDIR 00:08:39.755 #undef SPDK_CONFIG_LTO 00:08:39.755 #define SPDK_CONFIG_MAX_LCORES 00:08:39.755 #define SPDK_CONFIG_NVME_CUSE 1 00:08:39.755 #undef SPDK_CONFIG_OCF 00:08:39.755 #define SPDK_CONFIG_OCF_PATH 00:08:39.755 #define SPDK_CONFIG_OPENSSL_PATH 00:08:39.755 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:39.755 #undef SPDK_CONFIG_PGO_USE 00:08:39.755 #define SPDK_CONFIG_PREFIX /usr/local 00:08:39.755 #undef SPDK_CONFIG_RAID5F 00:08:39.755 #undef SPDK_CONFIG_RBD 00:08:39.755 #define SPDK_CONFIG_RDMA 1 00:08:39.755 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:39.755 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:39.755 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:39.755 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:39.755 #undef SPDK_CONFIG_SHARED 00:08:39.755 #undef SPDK_CONFIG_SMA 00:08:39.755 #define SPDK_CONFIG_TESTS 1 00:08:39.755 #undef SPDK_CONFIG_TSAN 00:08:39.755 #define SPDK_CONFIG_UBLK 1 00:08:39.755 #define SPDK_CONFIG_UBSAN 1 00:08:39.755 #undef SPDK_CONFIG_UNIT_TESTS 00:08:39.755 #undef SPDK_CONFIG_URING 00:08:39.755 #define SPDK_CONFIG_URING_PATH 00:08:39.755 #undef SPDK_CONFIG_URING_ZNS 00:08:39.755 #undef SPDK_CONFIG_USDT 00:08:39.755 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:39.755 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:39.755 #define SPDK_CONFIG_VFIO_USER 1 00:08:39.755 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:39.755 #define SPDK_CONFIG_VHOST 1 00:08:39.755 #define SPDK_CONFIG_VIRTIO 1 00:08:39.755 #undef SPDK_CONFIG_VTUNE 00:08:39.755 #define SPDK_CONFIG_VTUNE_DIR 00:08:39.755 #define SPDK_CONFIG_WERROR 1 00:08:39.755 #define SPDK_CONFIG_WPDK_DIR 00:08:39.755 #undef SPDK_CONFIG_XNVME 00:08:39.755 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:39.755 07:33:50 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:39.755 07:33:50 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:39.755 07:33:50 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:39.755 07:33:50 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:39.755 07:33:50 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:39.755 07:33:50 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:39.755 07:33:50 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:39.755 07:33:50 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:39.755 07:33:50 -- paths/export.sh@5 -- # export PATH 00:08:39.756 07:33:50 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:39.756 07:33:50 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:39.756 07:33:50 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:39.756 07:33:50 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:39.756 07:33:50 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:39.756 07:33:50 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:39.756 07:33:50 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:39.756 07:33:50 -- pm/common@16 -- # TEST_TAG=N/A 00:08:39.756 07:33:50 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:39.756 07:33:50 -- common/autotest_common.sh@52 -- # : 1 00:08:39.756 07:33:50 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:39.756 07:33:50 -- common/autotest_common.sh@56 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:39.756 07:33:50 -- common/autotest_common.sh@58 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:39.756 07:33:50 -- common/autotest_common.sh@60 -- # : 1 00:08:39.756 07:33:50 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:08:39.756 07:33:50 -- common/autotest_common.sh@62 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:39.756 07:33:50 -- common/autotest_common.sh@64 -- # : 00:08:39.756 07:33:50 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:39.756 07:33:50 -- common/autotest_common.sh@66 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:39.756 07:33:50 -- common/autotest_common.sh@68 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:39.756 07:33:50 -- common/autotest_common.sh@70 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:39.756 07:33:50 -- common/autotest_common.sh@72 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:39.756 07:33:50 -- common/autotest_common.sh@74 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:39.756 07:33:50 -- common/autotest_common.sh@76 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:39.756 07:33:50 -- common/autotest_common.sh@78 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:39.756 07:33:50 -- common/autotest_common.sh@80 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:39.756 07:33:50 -- common/autotest_common.sh@82 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:39.756 07:33:50 -- common/autotest_common.sh@84 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:39.756 07:33:50 -- common/autotest_common.sh@86 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:39.756 07:33:50 -- common/autotest_common.sh@88 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:39.756 07:33:50 -- common/autotest_common.sh@90 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:39.756 07:33:50 -- common/autotest_common.sh@92 -- # : 1 00:08:39.756 07:33:50 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:39.756 07:33:50 -- common/autotest_common.sh@94 -- # : 1 00:08:39.756 07:33:50 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:39.756 07:33:50 -- common/autotest_common.sh@96 -- # : rdma 00:08:39.756 07:33:50 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:39.756 07:33:50 -- common/autotest_common.sh@98 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:39.756 07:33:50 -- common/autotest_common.sh@100 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:39.756 07:33:50 -- common/autotest_common.sh@102 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:39.756 07:33:50 -- common/autotest_common.sh@104 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:39.756 07:33:50 -- common/autotest_common.sh@106 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:39.756 07:33:50 -- common/autotest_common.sh@108 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:08:39.756 07:33:50 -- common/autotest_common.sh@110 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:39.756 07:33:50 -- common/autotest_common.sh@112 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:39.756 07:33:50 -- common/autotest_common.sh@114 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:39.756 07:33:50 -- common/autotest_common.sh@116 -- # : 1 00:08:39.756 07:33:50 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:39.756 07:33:50 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:39.756 07:33:50 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:39.756 07:33:50 -- common/autotest_common.sh@120 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:39.756 07:33:50 -- common/autotest_common.sh@122 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:39.756 07:33:50 -- common/autotest_common.sh@124 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:39.756 07:33:50 -- common/autotest_common.sh@126 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:39.756 07:33:50 -- common/autotest_common.sh@128 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:39.756 07:33:50 -- common/autotest_common.sh@130 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:39.756 07:33:50 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:39.756 07:33:50 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:39.756 07:33:50 -- common/autotest_common.sh@134 -- # : true 00:08:39.756 07:33:50 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:39.756 07:33:50 -- common/autotest_common.sh@136 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:39.756 07:33:50 -- common/autotest_common.sh@138 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:39.756 07:33:50 -- common/autotest_common.sh@140 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:39.756 07:33:50 -- common/autotest_common.sh@142 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:39.756 07:33:50 -- common/autotest_common.sh@144 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:39.756 07:33:50 -- common/autotest_common.sh@146 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:39.756 07:33:50 -- common/autotest_common.sh@148 -- # : 00:08:39.756 07:33:50 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:39.756 07:33:50 -- common/autotest_common.sh@150 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:39.756 07:33:50 -- common/autotest_common.sh@152 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:39.756 07:33:50 -- common/autotest_common.sh@154 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:39.756 07:33:50 -- 
common/autotest_common.sh@156 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:39.756 07:33:50 -- common/autotest_common.sh@158 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:39.756 07:33:50 -- common/autotest_common.sh@160 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:39.756 07:33:50 -- common/autotest_common.sh@163 -- # : 00:08:39.756 07:33:50 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:39.756 07:33:50 -- common/autotest_common.sh@165 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:39.756 07:33:50 -- common/autotest_common.sh@167 -- # : 0 00:08:39.756 07:33:50 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:39.756 07:33:50 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:39.756 07:33:50 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:39.757 07:33:50 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:39.757 07:33:50 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:39.757 07:33:50 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:39.757 07:33:50 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:39.757 07:33:50 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:39.757 07:33:50 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:39.757 07:33:50 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:39.757 07:33:50 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:39.757 07:33:50 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:39.757 07:33:50 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:39.757 07:33:50 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:39.757 07:33:50 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:39.757 07:33:50 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:39.757 07:33:50 -- common/autotest_common.sh@196 -- # cat 00:08:39.757 07:33:50 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:39.757 07:33:50 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:39.757 07:33:50 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:39.757 07:33:50 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:39.757 07:33:50 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:39.757 07:33:50 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:39.757 07:33:50 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:39.757 07:33:50 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:39.757 07:33:50 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:39.757 07:33:50 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:39.757 07:33:50 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:39.757 07:33:50 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:39.757 07:33:50 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:39.757 07:33:50 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:39.757 07:33:50 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:39.757 07:33:50 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:39.757 07:33:50 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:39.757 07:33:50 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:39.757 07:33:50 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:39.757 07:33:50 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:39.757 07:33:50 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:39.757 07:33:50 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:39.757 07:33:50 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:39.757 07:33:50 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:39.757 07:33:50 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:39.757 07:33:50 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:39.757 07:33:50 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:39.757 07:33:50 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:39.757 07:33:50 -- common/autotest_common.sh@259 -- # valgrind= 00:08:39.757 07:33:50 -- common/autotest_common.sh@265 -- # uname -s 00:08:39.757 07:33:50 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:39.757 07:33:50 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:39.757 07:33:50 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:39.757 07:33:50 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:39.757 07:33:50 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:39.757 07:33:50 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:39.757 07:33:50 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:39.757 07:33:50 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:39.757 07:33:50 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:39.757 07:33:50 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:39.757 07:33:50 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:39.757 07:33:50 -- common/autotest_common.sh@319 -- # [[ -z 1661989 ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@319 -- # kill -0 1661989 00:08:39.757 07:33:50 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:39.757 07:33:50 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:39.757 07:33:50 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:39.757 07:33:50 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:39.757 07:33:50 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:39.757 07:33:50 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:39.757 07:33:50 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:39.757 07:33:50 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:39.757 07:33:50 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.HPJygS 00:08:40.017 07:33:50 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:40.017 07:33:50 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:40.017 07:33:50 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:40.017 07:33:50 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.HPJygS/tests/vfio /tmp/spdk.HPJygS 00:08:40.017 07:33:50 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@328 -- # df -T 00:08:40.017 07:33:50 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:40.017 07:33:50 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=51907637248 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=9822969856 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863560704 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=1744896 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:40.017 07:33:50 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:40.017 07:33:50 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:40.017 07:33:50 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:40.017 07:33:50 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:40.017 * Looking for test storage... 
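The df -T enumeration above feeds the set_test_storage selection traced below: for each candidate mount, the bytes already used plus the requested size are projected against the filesystem total, and the candidate is kept only while projected usage stays at or below 95%. A back-of-the-envelope recomputation with the values visible in this run (a sketch mirroring the shell arithmetic in test/common/autotest_common.sh, not a substitute for it):

# Values copied verbatim from this trace for the spdk_root overlay on /.
requested_size=2214592512               # ~2 GiB plus slack requested
used=9822969856                         # uses["/"] from the df pass
total=61730607104                       # sizes["/"] from the df pass
new_size=$(( used + requested_size ))   # 12037562368
# Candidate survives only if projected usage stays <= 95% of the filesystem:
if (( new_size * 100 / total > 95 )); then
  echo "rejected, fall back to the mktemp dir under /tmp"
else
  echo "accepted"   # 12037562368 * 100 / 61730607104 ~= 19%, well under 95
fi

Here the projection lands around 19%, which is why the trace below accepts the workspace path as SPDK_TEST_STORAGE instead of falling back to the /tmp/spdk.HPJygS directory created earlier.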
00:08:40.017 07:33:50 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:40.017 07:33:50 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:40.017 07:33:50 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.017 07:33:50 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:40.017 07:33:50 -- common/autotest_common.sh@373 -- # mount=/ 00:08:40.017 07:33:50 -- common/autotest_common.sh@375 -- # target_space=51907637248 00:08:40.017 07:33:50 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:40.017 07:33:50 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:40.017 07:33:50 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:40.017 07:33:50 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:40.017 07:33:50 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:40.017 07:33:50 -- common/autotest_common.sh@382 -- # new_size=12037562368 00:08:40.017 07:33:50 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:40.017 07:33:50 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.017 07:33:50 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.017 07:33:50 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.017 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.017 07:33:50 -- common/autotest_common.sh@390 -- # return 0 00:08:40.017 07:33:50 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:40.017 07:33:50 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:40.017 07:33:50 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:40.018 07:33:50 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1682 -- # true 00:08:40.018 07:33:50 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:40.018 07:33:50 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:40.018 07:33:50 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:40.018 07:33:50 -- common/autotest_common.sh@27 -- # exec 00:08:40.018 07:33:50 -- common/autotest_common.sh@29 -- # exec 00:08:40.018 07:33:50 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:40.018 07:33:50 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:40.018 07:33:50 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:40.018 07:33:50 -- common/autotest_common.sh@18 -- # set -x 00:08:40.018 07:33:50 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:40.018 07:33:50 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:40.018 07:33:50 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:40.018 07:33:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:40.018 07:33:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:40.018 07:33:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:40.018 07:33:50 -- scripts/common.sh@335 -- # IFS=.-: 00:08:40.018 07:33:50 -- scripts/common.sh@335 -- # read -ra ver1 00:08:40.018 07:33:50 -- scripts/common.sh@336 -- # IFS=.-: 00:08:40.018 07:33:50 -- scripts/common.sh@336 -- # read -ra ver2 00:08:40.018 07:33:50 -- scripts/common.sh@337 -- # local 'op=<' 00:08:40.018 07:33:50 -- scripts/common.sh@339 -- # ver1_l=2 00:08:40.018 07:33:50 -- scripts/common.sh@340 -- # ver2_l=1 00:08:40.018 07:33:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:40.018 07:33:50 -- scripts/common.sh@343 -- # case "$op" in 00:08:40.018 07:33:50 -- scripts/common.sh@344 -- # : 1 00:08:40.018 07:33:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:40.018 07:33:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:40.018 07:33:50 -- scripts/common.sh@364 -- # decimal 1 00:08:40.018 07:33:50 -- scripts/common.sh@352 -- # local d=1 00:08:40.018 07:33:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:40.018 07:33:50 -- scripts/common.sh@354 -- # echo 1 00:08:40.018 07:33:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:40.018 07:33:50 -- scripts/common.sh@365 -- # decimal 2 00:08:40.018 07:33:50 -- scripts/common.sh@352 -- # local d=2 00:08:40.018 07:33:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:40.018 07:33:50 -- scripts/common.sh@354 -- # echo 2 00:08:40.018 07:33:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:40.018 07:33:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:40.018 07:33:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:40.018 07:33:50 -- scripts/common.sh@367 -- # return 0 00:08:40.018 07:33:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:40.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.018 --rc genhtml_branch_coverage=1 00:08:40.018 --rc genhtml_function_coverage=1 00:08:40.018 --rc genhtml_legend=1 00:08:40.018 --rc geninfo_all_blocks=1 00:08:40.018 --rc geninfo_unexecuted_blocks=1 00:08:40.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.018 ' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:40.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.018 --rc genhtml_branch_coverage=1 00:08:40.018 --rc genhtml_function_coverage=1 00:08:40.018 --rc genhtml_legend=1 00:08:40.018 --rc geninfo_all_blocks=1 00:08:40.018 --rc geninfo_unexecuted_blocks=1 00:08:40.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.018 ' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:40.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:08:40.018 --rc genhtml_branch_coverage=1 00:08:40.018 --rc genhtml_function_coverage=1 00:08:40.018 --rc genhtml_legend=1 00:08:40.018 --rc geninfo_all_blocks=1 00:08:40.018 --rc geninfo_unexecuted_blocks=1 00:08:40.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.018 ' 00:08:40.018 07:33:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:40.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.018 --rc genhtml_branch_coverage=1 00:08:40.018 --rc genhtml_function_coverage=1 00:08:40.018 --rc genhtml_legend=1 00:08:40.018 --rc geninfo_all_blocks=1 00:08:40.018 --rc geninfo_unexecuted_blocks=1 00:08:40.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.018 ' 00:08:40.018 07:33:50 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:40.018 07:33:50 -- ../common.sh@8 -- # pids=() 00:08:40.018 07:33:50 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:40.018 07:33:50 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:40.018 07:33:50 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:40.018 07:33:50 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:40.018 07:33:50 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:40.018 07:33:50 -- vfio/run.sh@65 -- # mem_size=0 00:08:40.018 07:33:50 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:40.018 07:33:50 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:40.018 07:33:50 -- ../common.sh@69 -- # local fuzz_num=7 00:08:40.018 07:33:50 -- ../common.sh@70 -- # local time=1 00:08:40.018 07:33:50 -- ../common.sh@72 -- # (( i = 0 )) 00:08:40.018 07:33:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.018 07:33:50 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:40.018 07:33:50 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:40.018 07:33:50 -- vfio/run.sh@23 -- # local timen=1 00:08:40.018 07:33:50 -- vfio/run.sh@24 -- # local core=0x1 00:08:40.018 07:33:50 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:40.018 07:33:50 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:40.018 07:33:50 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:40.018 07:33:50 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:40.018 07:33:50 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:40.018 07:33:50 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:40.018 07:33:50 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:40.018 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:40.018 07:33:50 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:40.018 [2024-11-28 07:33:50.701316] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:40.018 [2024-11-28 07:33:50.701391] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662051 ] 00:08:40.018 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.018 [2024-11-28 07:33:50.772237] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.277 [2024-11-28 07:33:50.809141] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:40.277 [2024-11-28 07:33:50.809280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.277 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.277 INFO: Seed: 2020036088 00:08:40.277 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:40.277 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:40.277 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:40.277 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.277 #2 INITED exec/s: 0 rss: 60Mb 00:08:40.277 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.277 This may also happen if the target rejected all inputs we tried so far 00:08:40.796 NEW_FUNC[1/629]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:40.796 NEW_FUNC[2/629]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:40.796 #11 NEW cov: 10737 ft: 10730 corp: 2/45b lim: 60 exec/s: 0 rss: 65Mb L: 44/44 MS: 4 InsertByte-CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:41.055 NEW_FUNC[1/2]: 0x169ee78 in nvme_qpair_submit_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:1089 00:08:41.055 NEW_FUNC[2/2]: 0x16a04d8 in _nvme_qpair_submit_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:943 00:08:41.055 #12 NEW cov: 10776 ft: 14708 corp: 3/89b lim: 60 exec/s: 0 rss: 67Mb L: 44/44 MS: 1 ChangeBinInt- 00:08:41.314 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.314 #16 NEW cov: 10793 ft: 15617 corp: 4/133b lim: 60 exec/s: 0 rss: 68Mb L: 44/44 MS: 4 ChangeBinInt-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:41.314 #17 NEW cov: 10796 ft: 16143 corp: 5/176b lim: 60 exec/s: 17 rss: 68Mb L: 43/44 MS: 1 EraseBytes- 00:08:41.574 #18 NEW cov: 10796 ft: 16691 corp: 6/217b lim: 60 exec/s: 18 rss: 68Mb L: 41/44 MS: 1 EraseBytes- 00:08:41.833 #19 NEW cov: 10796 ft: 16953 corp: 7/274b lim: 60 exec/s: 19 rss: 68Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:08:41.833 #20 NEW cov: 10796 ft: 17229 corp: 8/322b lim: 60 exec/s: 20 rss: 68Mb L: 48/57 MS: 1 InsertRepeatedBytes- 00:08:42.092 #21 NEW cov: 10796 ft: 17502 corp: 9/356b lim: 60 exec/s: 21 rss: 68Mb L: 34/57 MS: 1 EraseBytes- 00:08:42.350 #22 NEW cov: 10803 ft: 17658 corp: 10/413b lim: 60 exec/s: 22 rss: 68Mb L: 57/57 MS: 1 ChangeBinInt- 00:08:42.350 #23 NEW cov: 10803 ft: 17686 corp: 11/467b lim: 60 exec/s: 11 rss: 68Mb L: 54/57 MS: 1 EraseBytes- 00:08:42.351 #23 
DONE cov: 10803 ft: 17686 corp: 11/467b lim: 60 exec/s: 11 rss: 68Mb 00:08:42.351 Done 23 runs in 2 second(s) 00:08:42.610 07:33:53 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:42.610 07:33:53 -- ../common.sh@72 -- # (( i++ )) 00:08:42.610 07:33:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.610 07:33:53 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:42.610 07:33:53 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:42.610 07:33:53 -- vfio/run.sh@23 -- # local timen=1 00:08:42.610 07:33:53 -- vfio/run.sh@24 -- # local core=0x1 00:08:42.610 07:33:53 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:42.610 07:33:53 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:42.610 07:33:53 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:42.610 07:33:53 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:42.610 07:33:53 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:42.610 07:33:53 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:42.610 07:33:53 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:42.610 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:42.610 07:33:53 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:42.610 [2024-11-28 07:33:53.379276] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:42.610 [2024-11-28 07:33:53.379343] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662098 ] 00:08:42.870 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.870 [2024-11-28 07:33:53.450003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.870 [2024-11-28 07:33:53.485702] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:42.870 [2024-11-28 07:33:53.485862] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.128 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.128 INFO: Seed: 401059964 00:08:43.128 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:43.128 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:43.129 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:43.129 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.129 #2 INITED exec/s: 0 rss: 60Mb 00:08:43.129 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
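Each relaunch above logs a "[ DPDK EAL parameters: ... ]" record, and those records show why seven back-to-back SPDK instances do not collide: every run gets its own --file-prefix=spdk_pidNNNN and runs with --no-shconf. A quick, illustrative way to confirm that from a saved copy of this console output (build.log is an assumed file name):

    # One line per distinct SPDK instance started during the vfio fuzz phase.
    grep -o -- '--file-prefix=spdk_pid[0-9]*' build.log | sort -u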
00:08:43.129 This may also happen if the target rejected all inputs we tried so far 00:08:43.129 [2024-11-28 07:33:53.767639] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.129 [2024-11-28 07:33:53.767674] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.129 [2024-11-28 07:33:53.767692] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.387 NEW_FUNC[1/636]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:43.387 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:43.387 #19 NEW cov: 10763 ft: 10739 corp: 2/20b lim: 40 exec/s: 0 rss: 65Mb L: 19/19 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:43.646 [2024-11-28 07:33:54.233268] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.646 [2024-11-28 07:33:54.233299] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.646 [2024-11-28 07:33:54.233317] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.646 NEW_FUNC[1/2]: 0x1374b48 in spdk_nvme_opc_get_data_transfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/nvme_spec.h:1728 00:08:43.646 NEW_FUNC[2/2]: 0x1652908 in nvme_payload_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:260 00:08:43.646 #20 NEW cov: 10795 ft: 13121 corp: 3/53b lim: 40 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:43.905 [2024-11-28 07:33:54.431103] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.905 [2024-11-28 07:33:54.431125] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.905 [2024-11-28 07:33:54.431142] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:43.905 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.905 #21 NEW cov: 10812 ft: 14969 corp: 4/73b lim: 40 exec/s: 0 rss: 68Mb L: 20/33 MS: 1 InsertByte- 00:08:43.905 [2024-11-28 07:33:54.617720] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:43.905 [2024-11-28 07:33:54.617743] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:43.905 [2024-11-28 07:33:54.617761] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.164 #22 NEW cov: 10812 ft: 15833 corp: 5/106b lim: 40 exec/s: 22 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:44.164 [2024-11-28 07:33:54.801397] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.164 [2024-11-28 07:33:54.801420] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.164 [2024-11-28 07:33:54.801439] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.164 #23 NEW cov: 10812 ft: 16507 corp: 6/126b lim: 40 exec/s: 23 rss: 68Mb L: 20/33 MS: 1 ChangeBinInt- 00:08:44.427 [2024-11-28 07:33:54.988085] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.427 [2024-11-28 07:33:54.988107] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: 
cmd 1 failed: Invalid argument 00:08:44.427 [2024-11-28 07:33:54.988123] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.427 #24 NEW cov: 10812 ft: 16549 corp: 7/146b lim: 40 exec/s: 24 rss: 68Mb L: 20/33 MS: 1 InsertByte- 00:08:44.427 [2024-11-28 07:33:55.173866] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.427 [2024-11-28 07:33:55.173888] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.427 [2024-11-28 07:33:55.173904] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.686 #30 NEW cov: 10812 ft: 16624 corp: 8/166b lim: 40 exec/s: 30 rss: 68Mb L: 20/33 MS: 1 ChangeBit- 00:08:44.686 [2024-11-28 07:33:55.358741] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.686 [2024-11-28 07:33:55.358763] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.686 [2024-11-28 07:33:55.358780] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.945 #31 NEW cov: 10812 ft: 16674 corp: 9/204b lim: 40 exec/s: 31 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:44.945 [2024-11-28 07:33:55.543243] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.945 [2024-11-28 07:33:55.543265] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.945 [2024-11-28 07:33:55.543283] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.945 #32 NEW cov: 10819 ft: 16770 corp: 10/237b lim: 40 exec/s: 32 rss: 68Mb L: 33/38 MS: 1 ChangeBit- 00:08:45.205 [2024-11-28 07:33:55.728605] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.205 [2024-11-28 07:33:55.728627] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.205 [2024-11-28 07:33:55.728645] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.205 #33 NEW cov: 10819 ft: 17157 corp: 11/270b lim: 40 exec/s: 16 rss: 68Mb L: 33/38 MS: 1 ChangeByte- 00:08:45.205 #33 DONE cov: 10819 ft: 17157 corp: 11/270b lim: 40 exec/s: 16 rss: 68Mb 00:08:45.205 Done 33 runs in 2 second(s) 00:08:45.464 07:33:56 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:45.464 07:33:56 -- ../common.sh@72 -- # (( i++ )) 00:08:45.464 07:33:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.464 07:33:56 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:45.464 07:33:56 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:45.464 07:33:56 -- vfio/run.sh@23 -- # local timen=1 00:08:45.464 07:33:56 -- vfio/run.sh@24 -- # local core=0x1 00:08:45.464 07:33:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:45.464 07:33:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:45.464 07:33:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:45.464 07:33:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:45.464 07:33:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:45.464 07:33:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:45.464 07:33:56 -- vfio/run.sh@34 -- # sed -e 
's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:45.464 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:45.464 07:33:56 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:45.464 [2024-11-28 07:33:56.140738] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:45.464 [2024-11-28 07:33:56.140833] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662151 ] 00:08:45.464 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.465 [2024-11-28 07:33:56.211871] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.724 [2024-11-28 07:33:56.247313] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:45.724 [2024-11-28 07:33:56.247470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.724 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.724 INFO: Seed: 3164066093 00:08:45.724 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:45.724 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:45.724 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:45.724 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.724 #2 INITED exec/s: 0 rss: 60Mb 00:08:45.724 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:45.724 This may also happen if the target rejected all inputs we tried so far 00:08:45.983 [2024-11-28 07:33:56.537251] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.242 NEW_FUNC[1/633]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:46.242 NEW_FUNC[2/633]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:46.242 #14 NEW cov: 10729 ft: 10730 corp: 2/46b lim: 80 exec/s: 0 rss: 65Mb L: 45/45 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:46.242 [2024-11-28 07:33:57.000352] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.501 NEW_FUNC[1/3]: 0x1374b88 in vfio_user_map_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:1680 00:08:46.501 NEW_FUNC[2/3]: 0x1374df8 in nvme_map_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:960 00:08:46.501 #15 NEW cov: 10772 ft: 13627 corp: 3/77b lim: 80 exec/s: 0 rss: 66Mb L: 31/45 MS: 1 EraseBytes- 00:08:46.501 [2024-11-28 07:33:57.187649] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.760 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.760 #16 NEW cov: 10789 ft: 15760 corp: 4/108b lim: 80 exec/s: 0 rss: 67Mb L: 31/45 MS: 1 ChangeBit- 00:08:46.760 [2024-11-28 07:33:57.371515] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:46.760 #17 NEW cov: 10789 ft: 16194 corp: 5/139b lim: 80 exec/s: 17 rss: 67Mb L: 31/45 MS: 1 ChangeBit- 00:08:47.019 [2024-11-28 07:33:57.555473] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.019 #18 NEW cov: 10789 ft: 16705 corp: 6/178b lim: 80 exec/s: 18 rss: 68Mb L: 39/45 MS: 1 InsertRepeatedBytes- 00:08:47.019 [2024-11-28 07:33:57.741164] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.278 #23 NEW cov: 10789 ft: 16973 corp: 7/245b lim: 80 exec/s: 23 rss: 68Mb L: 67/67 MS: 5 InsertByte-ChangeByte-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:47.278 [2024-11-28 07:33:57.934668] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.278 #24 NEW cov: 10789 ft: 17371 corp: 8/275b lim: 80 exec/s: 24 rss: 68Mb L: 30/67 MS: 1 EraseBytes- 00:08:47.537 [2024-11-28 07:33:58.118913] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.537 #28 NEW cov: 10789 ft: 17595 corp: 9/305b lim: 80 exec/s: 28 rss: 68Mb L: 30/67 MS: 4 ChangeByte-ChangeASCIIInt-ChangeASCIIInt-InsertRepeatedBytes- 00:08:47.796 [2024-11-28 07:33:58.312330] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.796 #29 NEW cov: 10796 ft: 17903 corp: 10/336b lim: 80 exec/s: 29 rss: 68Mb L: 31/67 MS: 1 ChangeByte- 00:08:47.796 [2024-11-28 07:33:58.495447] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.055 #30 NEW cov: 10796 ft: 18085 corp: 11/403b lim: 80 exec/s: 15 rss: 68Mb L: 67/67 MS: 1 ChangeBit- 00:08:48.055 #30 DONE cov: 10796 ft: 18085 corp: 11/403b lim: 80 exec/s: 15 rss: 68Mb 00:08:48.055 Done 30 runs in 2 second(s) 00:08:48.315 07:33:58 -- vfio/run.sh@49 -- # rm 
-rf /tmp/vfio-user-2 00:08:48.315 07:33:58 -- ../common.sh@72 -- # (( i++ )) 00:08:48.315 07:33:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.315 07:33:58 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:48.315 07:33:58 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:48.315 07:33:58 -- vfio/run.sh@23 -- # local timen=1 00:08:48.315 07:33:58 -- vfio/run.sh@24 -- # local core=0x1 00:08:48.315 07:33:58 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:48.315 07:33:58 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:48.315 07:33:58 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:48.315 07:33:58 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:48.315 07:33:58 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:48.315 07:33:58 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:48.315 07:33:58 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:48.315 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:48.315 07:33:58 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:48.315 [2024-11-28 07:33:58.904231] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:48.315 [2024-11-28 07:33:58.904322] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662193 ] 00:08:48.315 EAL: No free 2048 kB hugepages reported on node 1 00:08:48.315 [2024-11-28 07:33:58.975265] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.315 [2024-11-28 07:33:59.010514] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:48.315 [2024-11-28 07:33:59.010695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.574 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.574 INFO: Seed: 1632100499 00:08:48.574 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:48.574 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:48.574 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:48.574 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.574 #2 INITED exec/s: 0 rss: 60Mb 00:08:48.574 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.574 This may also happen if the target rejected all inputs we tried so far 00:08:49.091 NEW_FUNC[1/631]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:49.091 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:49.091 #8 NEW cov: 10742 ft: 10532 corp: 2/86b lim: 320 exec/s: 0 rss: 66Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:08:49.350 NEW_FUNC[1/1]: 0x1609fa8 in _nvme_md_excluded_from_xfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:54 00:08:49.350 #9 NEW cov: 10762 ft: 13834 corp: 3/171b lim: 320 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeByte- 00:08:49.350 [2024-11-28 07:33:59.935403] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:49.350 [2024-11-28 07:33:59.935442] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:49.350 [2024-11-28 07:33:59.935453] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:49.350 [2024-11-28 07:33:59.935471] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:49.350 NEW_FUNC[1/7]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:08:49.350 NEW_FUNC[2/7]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:08:49.350 #10 NEW cov: 10811 ft: 14569 corp: 4/267b lim: 320 exec/s: 0 rss: 68Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:49.609 #11 NEW cov: 10811 ft: 15698 corp: 5/401b lim: 320 exec/s: 11 rss: 68Mb L: 134/134 MS: 1 CopyPart- 00:08:49.868 #12 NEW cov: 10811 ft: 16599 corp: 6/511b lim: 320 exec/s: 12 rss: 68Mb L: 110/134 MS: 1 InsertRepeatedBytes- 00:08:49.868 #13 NEW cov: 10811 ft: 16860 corp: 7/562b lim: 320 exec/s: 13 rss: 68Mb L: 51/134 MS: 1 InsertRepeatedBytes- 00:08:50.127 #14 NEW cov: 10811 ft: 17200 corp: 8/672b lim: 320 exec/s: 14 rss: 68Mb L: 110/134 MS: 1 CrossOver- 00:08:50.127 [2024-11-28 07:34:00.861250] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:50.127 [2024-11-28 07:34:00.861283] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:50.127 [2024-11-28 07:34:00.861295] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:50.127 [2024-11-28 07:34:00.861312] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:50.386 #15 NEW cov: 10811 ft: 17294 corp: 9/768b lim: 320 exec/s: 15 rss: 68Mb L: 96/134 MS: 1 ChangeByte- 00:08:50.386 #20 NEW cov: 10818 ft: 17370 corp: 10/871b lim: 320 exec/s: 20 rss: 68Mb L: 103/134 MS: 5 CMP-CopyPart-ShuffleBytes-ChangeByte-CrossOver- DE: "`\320wV\001\177\000\000"- 00:08:50.645 #21 NEW cov: 10818 ft: 17852 corp: 11/974b lim: 320 exec/s: 10 rss: 68Mb L: 103/134 MS: 1 ChangeBinInt- 00:08:50.646 #21 DONE cov: 10818 ft: 17852 corp: 11/974b lim: 320 exec/s: 10 rss: 68Mb 00:08:50.646 ###### Recommended dictionary. 
###### 00:08:50.646 "`\320wV\001\177\000\000" # Uses: 0 00:08:50.646 ###### End of recommended dictionary. ###### 00:08:50.646 Done 21 runs in 2 second(s) 00:08:50.904 07:34:01 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:50.904 07:34:01 -- ../common.sh@72 -- # (( i++ )) 00:08:50.904 07:34:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.904 07:34:01 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:50.904 07:34:01 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:50.904 07:34:01 -- vfio/run.sh@23 -- # local timen=1 00:08:50.904 07:34:01 -- vfio/run.sh@24 -- # local core=0x1 00:08:50.904 07:34:01 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:50.904 07:34:01 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:50.904 07:34:01 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:50.904 07:34:01 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:50.904 07:34:01 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:50.904 07:34:01 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:50.904 07:34:01 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:50.904 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:50.904 07:34:01 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:50.904 [2024-11-28 07:34:01.630795] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:50.904 [2024-11-28 07:34:01.630884] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662248 ] 00:08:50.904 EAL: No free 2048 kB hugepages reported on node 1 00:08:51.163 [2024-11-28 07:34:01.701251] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.163 [2024-11-28 07:34:01.736755] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:51.163 [2024-11-28 07:34:01.736913] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.163 INFO: Running with entropic power schedule (0xFF, 100). 00:08:51.163 INFO: Seed: 64132386 00:08:51.163 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:51.163 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:51.163 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:51.163 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.163 #2 INITED exec/s: 0 rss: 60Mb 00:08:51.163 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
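The start_llvm_fuzz trace just above repeats the same preparation for every fuzzer type: create the fuzzer's private directories, rewrite the shared template config so the vfio-user domain paths point at /tmp/vfio-user-N, then launch llvm_vfio_fuzz against those sockets. A condensed sketch of that setup, reconstructed from the traces (paths shortened, and the redirection of sed's output into the per-fuzzer config is an assumption, since xtrace does not print redirections):

    # Condensed per-fuzzer setup; reconstructed from the vfio/run.sh traces.
    start_one_fuzzer() {
      local n=$1 timen=$2 core=$3
      local dir=/tmp/vfio-user-$n
      mkdir -p "$dir" "$dir/domain/1" "$dir/domain/2" "corpus/llvm_vfio_$n"
      # Retarget the shared template at this fuzzer's private domain paths.
      sed -e "s%/tmp/vfio-user/domain/1%$dir/domain/1%" \
          -e "s%/tmp/vfio-user/domain/2%$dir/domain/2%" \
          test/fuzz/llvm/vfio/fuzz_vfio_json.conf > "$dir/fuzz_vfio_json.conf"
      ./llvm_vfio_fuzz -m "$core" -s 0 -F "$dir/domain/1" \
          -c "$dir/fuzz_vfio_json.conf" -t "$timen" -D "corpus/llvm_vfio_$n" \
          -Y "$dir/domain/2" -r "$dir/spdk$n.sock" -Z "$n"
    }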
00:08:51.163 This may also happen if the target rejected all inputs we tried so far 00:08:51.681 NEW_FUNC[1/630]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:51.681 NEW_FUNC[2/630]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:51.681 #7 NEW cov: 10741 ft: 10483 corp: 2/52b lim: 320 exec/s: 0 rss: 65Mb L: 51/51 MS: 5 ChangeByte-ChangeBit-InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:08:51.940 NEW_FUNC[1/2]: 0x15ed068 in _is_io_flags_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:141 00:08:51.940 NEW_FUNC[2/2]: 0x1609fa8 in _nvme_md_excluded_from_xfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:54 00:08:51.940 #12 NEW cov: 10760 ft: 14207 corp: 3/155b lim: 320 exec/s: 0 rss: 67Mb L: 103/103 MS: 5 ChangeBit-InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes- 00:08:52.199 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:52.199 #13 NEW cov: 10777 ft: 15694 corp: 4/206b lim: 320 exec/s: 0 rss: 68Mb L: 51/103 MS: 1 ChangeBinInt- 00:08:52.199 #14 NEW cov: 10777 ft: 16454 corp: 5/247b lim: 320 exec/s: 14 rss: 68Mb L: 41/103 MS: 1 CrossOver- 00:08:52.458 #20 NEW cov: 10777 ft: 16736 corp: 6/298b lim: 320 exec/s: 20 rss: 68Mb L: 51/103 MS: 1 ChangeByte- 00:08:52.717 #21 NEW cov: 10777 ft: 17073 corp: 7/331b lim: 320 exec/s: 21 rss: 68Mb L: 33/103 MS: 1 CrossOver- 00:08:52.976 #22 NEW cov: 10777 ft: 17545 corp: 8/382b lim: 320 exec/s: 22 rss: 68Mb L: 51/103 MS: 1 ChangeASCIIInt- 00:08:52.976 [2024-11-28 07:34:03.579312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:52.976 [2024-11-28 07:34:03.579351] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:52.976 [2024-11-28 07:34:03.579363] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:52.976 [2024-11-28 07:34:03.579381] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:52.976 [2024-11-28 07:34:03.580312] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:52.976 [2024-11-28 07:34:03.580331] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:52.976 [2024-11-28 07:34:03.580348] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:52.976 NEW_FUNC[1/6]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:08:52.976 NEW_FUNC[2/6]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:08:52.976 #26 NEW cov: 10814 ft: 17986 corp: 9/430b lim: 320 exec/s: 26 rss: 68Mb L: 48/103 MS: 4 ChangeBit-ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:53.235 [2024-11-28 07:34:03.772175] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:53.235 [2024-11-28 07:34:03.772200] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA 
region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:53.235 [2024-11-28 07:34:03.772211] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:53.235 [2024-11-28 07:34:03.772227] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.235 [2024-11-28 07:34:03.773178] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:53.235 [2024-11-28 07:34:03.773196] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.235 [2024-11-28 07:34:03.773212] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.235 #27 NEW cov: 10821 ft: 18037 corp: 10/478b lim: 320 exec/s: 27 rss: 68Mb L: 48/103 MS: 1 ChangeBinInt- 00:08:53.494 #28 NEW cov: 10821 ft: 18206 corp: 11/529b lim: 320 exec/s: 14 rss: 68Mb L: 51/103 MS: 1 ChangeBinInt- 00:08:53.494 #28 DONE cov: 10821 ft: 18206 corp: 11/529b lim: 320 exec/s: 14 rss: 68Mb 00:08:53.494 Done 28 runs in 2 second(s) 00:08:53.754 07:34:04 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:53.754 07:34:04 -- ../common.sh@72 -- # (( i++ )) 00:08:53.754 07:34:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.754 07:34:04 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:53.754 07:34:04 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:53.754 07:34:04 -- vfio/run.sh@23 -- # local timen=1 00:08:53.754 07:34:04 -- vfio/run.sh@24 -- # local core=0x1 00:08:53.754 07:34:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:53.754 07:34:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:53.754 07:34:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:53.754 07:34:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:53.754 07:34:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:53.754 07:34:04 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:53.754 07:34:04 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:53.754 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:53.754 07:34:04 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:53.754 [2024-11-28 07:34:04.354965] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
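With run 4 finished and its /tmp/vfio-user-4 tree removed, the loop is now on its sixth iteration, and the shape of that loop is visible in the ../common.sh traces: count the fuzzer entry points, run each for one second on one core, clean up between runs. Roughly, as an illustration pieced together from the traces rather than the verbatim scripts:

    # Illustrative driver loop; names and values taken from the traces above.
    fuzzfile=test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
    fuzz_num=$(grep -c '\.fn =' "$fuzzfile")   # 7 here: one '.fn =' entry per fuzzer type
    time=1                                     # seconds per fuzzer, the -t 1 seen above
    for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" "$time" 0x1         # fuzzer index, run time, core mask
      rm -rf "/tmp/vfio-user-$i"               # drop this fuzzer's sockets and config
    done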
00:08:53.754 [2024-11-28 07:34:04.355033] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662300 ] 00:08:53.754 EAL: No free 2048 kB hugepages reported on node 1 00:08:53.754 [2024-11-28 07:34:04.423959] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.754 [2024-11-28 07:34:04.459793] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:53.754 [2024-11-28 07:34:04.459930] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.013 INFO: Running with entropic power schedule (0xFF, 100). 00:08:54.013 INFO: Seed: 2785133429 00:08:54.013 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:54.013 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:54.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:54.013 INFO: A corpus is not provided, starting from an empty corpus 00:08:54.013 #2 INITED exec/s: 0 rss: 59Mb 00:08:54.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:54.013 This may also happen if the target rejected all inputs we tried so far 00:08:54.013 [2024-11-28 07:34:04.732674] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.013 [2024-11-28 07:34:04.732719] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.537 NEW_FUNC[1/638]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:54.537 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:54.537 #6 NEW cov: 10780 ft: 10644 corp: 2/56b lim: 120 exec/s: 0 rss: 65Mb L: 55/55 MS: 4 CopyPart-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:54.537 [2024-11-28 07:34:05.183770] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.537 [2024-11-28 07:34:05.183813] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.537 #7 NEW cov: 10794 ft: 13151 corp: 3/111b lim: 120 exec/s: 0 rss: 67Mb L: 55/55 MS: 1 ChangeBinInt- 00:08:54.796 [2024-11-28 07:34:05.356903] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.796 [2024-11-28 07:34:05.356934] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.796 #9 NEW cov: 10794 ft: 13789 corp: 4/175b lim: 120 exec/s: 0 rss: 68Mb L: 64/64 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:54.796 [2024-11-28 07:34:05.539245] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.796 [2024-11-28 07:34:05.539275] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.054 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:55.054 #10 NEW cov: 10811 ft: 15676 corp: 5/239b lim: 120 exec/s: 0 rss: 68Mb L: 64/64 MS: 1 ChangeBinInt- 00:08:55.054 [2024-11-28 07:34:05.709767] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.054 [2024-11-28 
07:34:05.709796] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.054 #11 NEW cov: 10811 ft: 16496 corp: 6/271b lim: 120 exec/s: 11 rss: 68Mb L: 32/64 MS: 1 EraseBytes- 00:08:55.313 [2024-11-28 07:34:05.882090] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.313 [2024-11-28 07:34:05.882120] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.313 #12 NEW cov: 10811 ft: 17001 corp: 7/306b lim: 120 exec/s: 12 rss: 68Mb L: 35/64 MS: 1 EraseBytes- 00:08:55.313 [2024-11-28 07:34:06.054859] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.313 [2024-11-28 07:34:06.054888] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.572 #14 NEW cov: 10811 ft: 17080 corp: 8/360b lim: 120 exec/s: 14 rss: 68Mb L: 54/64 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:55.572 [2024-11-28 07:34:06.227238] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.572 [2024-11-28 07:34:06.227267] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.572 #15 NEW cov: 10811 ft: 17471 corp: 9/392b lim: 120 exec/s: 15 rss: 68Mb L: 32/64 MS: 1 ShuffleBytes- 00:08:55.831 [2024-11-28 07:34:06.398742] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.831 [2024-11-28 07:34:06.398771] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.831 #16 NEW cov: 10818 ft: 17792 corp: 10/448b lim: 120 exec/s: 16 rss: 68Mb L: 56/64 MS: 1 InsertByte- 00:08:55.831 [2024-11-28 07:34:06.570395] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.831 [2024-11-28 07:34:06.570424] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.091 #17 NEW cov: 10818 ft: 18070 corp: 11/512b lim: 120 exec/s: 8 rss: 68Mb L: 64/64 MS: 1 ShuffleBytes- 00:08:56.091 #17 DONE cov: 10818 ft: 18070 corp: 11/512b lim: 120 exec/s: 8 rss: 68Mb 00:08:56.091 Done 17 runs in 2 second(s) 00:08:56.350 07:34:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:56.350 07:34:06 -- ../common.sh@72 -- # (( i++ )) 00:08:56.350 07:34:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:56.350 07:34:06 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:56.350 07:34:06 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:56.350 07:34:06 -- vfio/run.sh@23 -- # local timen=1 00:08:56.350 07:34:06 -- vfio/run.sh@24 -- # local core=0x1 00:08:56.350 07:34:06 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:56.350 07:34:06 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:56.350 07:34:06 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:56.350 07:34:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:56.350 07:34:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:56.350 07:34:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:56.350 07:34:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:56.350 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:56.350 07:34:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:56.351 [2024-11-28 07:34:06.964398] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:56.351 [2024-11-28 07:34:06.964472] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662344 ] 00:08:56.351 EAL: No free 2048 kB hugepages reported on node 1 00:08:56.351 [2024-11-28 07:34:07.033876] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.351 [2024-11-28 07:34:07.069523] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:56.351 [2024-11-28 07:34:07.069675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.610 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.610 INFO: Seed: 1107162983 00:08:56.610 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:56.610 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:56.610 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:56.610 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.610 #2 INITED exec/s: 0 rss: 60Mb 00:08:56.610 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:56.610 This may also happen if the target rejected all inputs we tried so far 00:08:56.610 [2024-11-28 07:34:07.326657] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.610 [2024-11-28 07:34:07.326699] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.129 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:57.129 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:57.129 #15 NEW cov: 10773 ft: 10697 corp: 2/46b lim: 90 exec/s: 0 rss: 65Mb L: 45/45 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:57.129 [2024-11-28 07:34:07.728438] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.129 [2024-11-28 07:34:07.728482] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.129 #18 NEW cov: 10790 ft: 12676 corp: 3/93b lim: 90 exec/s: 0 rss: 67Mb L: 47/47 MS: 3 ChangeByte-InsertByte-CrossOver- 00:08:57.129 [2024-11-28 07:34:07.842226] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.129 [2024-11-28 07:34:07.842261] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.387 #19 NEW cov: 10790 ft: 13934 corp: 4/138b lim: 90 exec/s: 0 rss: 68Mb L: 45/47 MS: 1 ChangeBinInt- 00:08:57.387 [2024-11-28 07:34:07.955923] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.388 [2024-11-28 07:34:07.955958] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.388 #25 NEW cov: 10790 ft: 14639 corp: 5/183b lim: 90 exec/s: 0 rss: 68Mb L: 45/47 MS: 1 ShuffleBytes- 00:08:57.388 [2024-11-28 07:34:08.070643] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.388 [2024-11-28 07:34:08.070679] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.388 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:57.388 #26 NEW cov: 10807 ft: 15327 corp: 6/228b lim: 90 exec/s: 0 rss: 68Mb L: 45/47 MS: 1 ChangeBit- 00:08:57.646 [2024-11-28 07:34:08.184534] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.646 [2024-11-28 07:34:08.184570] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.646 #32 NEW cov: 10807 ft: 15650 corp: 7/296b lim: 90 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:08:57.646 [2024-11-28 07:34:08.299182] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.646 [2024-11-28 07:34:08.299217] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.646 #33 NEW cov: 10807 ft: 16161 corp: 8/330b lim: 90 exec/s: 33 rss: 68Mb L: 34/68 MS: 1 EraseBytes- 00:08:57.646 [2024-11-28 07:34:08.412879] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.646 [2024-11-28 07:34:08.412916] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.914 #34 NEW cov: 10807 ft: 16309 corp: 9/407b lim: 90 exec/s: 34 rss: 68Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:57.914 
[2024-11-28 07:34:08.528641] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.914 [2024-11-28 07:34:08.528675] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.914 #35 NEW cov: 10807 ft: 16598 corp: 10/441b lim: 90 exec/s: 35 rss: 68Mb L: 34/77 MS: 1 ShuffleBytes- 00:08:57.914 [2024-11-28 07:34:08.640333] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.914 [2024-11-28 07:34:08.640368] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.207 #36 NEW cov: 10807 ft: 16659 corp: 11/486b lim: 90 exec/s: 36 rss: 68Mb L: 45/77 MS: 1 ShuffleBytes- 00:08:58.207 [2024-11-28 07:34:08.768632] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.207 [2024-11-28 07:34:08.768674] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.207 #37 NEW cov: 10807 ft: 17330 corp: 12/531b lim: 90 exec/s: 37 rss: 68Mb L: 45/77 MS: 1 ChangeBit- 00:08:58.207 [2024-11-28 07:34:08.942124] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.207 [2024-11-28 07:34:08.942155] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.539 #38 NEW cov: 10807 ft: 17570 corp: 13/576b lim: 90 exec/s: 38 rss: 68Mb L: 45/77 MS: 1 ChangeByte- 00:08:58.539 [2024-11-28 07:34:09.115312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.539 [2024-11-28 07:34:09.115341] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.539 #39 NEW cov: 10814 ft: 17751 corp: 14/659b lim: 90 exec/s: 39 rss: 68Mb L: 83/83 MS: 1 CopyPart- 00:08:58.539 [2024-11-28 07:34:09.287826] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.539 [2024-11-28 07:34:09.287857] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.799 #40 NEW cov: 10814 ft: 17898 corp: 15/694b lim: 90 exec/s: 20 rss: 68Mb L: 35/83 MS: 1 InsertByte- 00:08:58.799 #40 DONE cov: 10814 ft: 17898 corp: 15/694b lim: 90 exec/s: 20 rss: 68Mb 00:08:58.799 Done 40 runs in 2 second(s) 00:08:59.058 07:34:09 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:59.058 07:34:09 -- ../common.sh@72 -- # (( i++ )) 00:08:59.058 07:34:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:59.058 07:34:09 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:59.058 00:08:59.058 real 0m19.389s 00:08:59.058 user 0m27.318s 00:08:59.058 sys 0m1.852s 00:08:59.058 07:34:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:59.058 07:34:09 -- common/autotest_common.sh@10 -- # set +x 00:08:59.058 ************************************ 00:08:59.058 END TEST vfio_fuzz 00:08:59.058 ************************************ 00:08:59.058 00:08:59.058 real 1m22.022s 00:08:59.058 user 2m6.276s 00:08:59.058 sys 0m9.236s 00:08:59.058 07:34:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:59.058 07:34:09 -- common/autotest_common.sh@10 -- # set +x 00:08:59.058 ************************************ 00:08:59.058 END TEST llvm_fuzz 00:08:59.058 ************************************ 00:08:59.058 07:34:09 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:59.058 07:34:09 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:59.058 07:34:09 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:59.058 
07:34:09 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:59.058 07:34:09 -- common/autotest_common.sh@10 -- # set +x 00:08:59.058 07:34:09 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:59.058 07:34:09 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:59.058 07:34:09 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:59.058 07:34:09 -- common/autotest_common.sh@10 -- # set +x 00:09:05.625 INFO: APP EXITING 00:09:05.625 INFO: killing all VMs 00:09:05.625 INFO: killing vhost app 00:09:05.625 INFO: EXIT DONE 00:09:07.531 Waiting for block devices as requested 00:09:07.531 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:07.531 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:07.791 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:07.791 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:07.791 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:07.791 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:08.051 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:08.051 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:08.051 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:08.310 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:08.310 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:08.310 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:08.570 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:08.570 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:08.570 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:08.830 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:08.830 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:13.028 Cleaning 00:09:13.028 Removing: /dev/shm/spdk_tgt_trace.pid1644451 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1641942 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1643236 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1644451 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1645248 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1645573 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1645927 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1646270 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1646604 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1646891 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1647177 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1647449 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1648191 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1651329 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1651800 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1652111 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1652176 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1652750 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1652918 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1653535 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1653603 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1653906 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1654175 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1654309 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1654483 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1655013 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1655159 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1655432 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1655759 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656024 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656089 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656146 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656414 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656701 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1656973 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1657177 00:09:13.028 
Removing: /var/run/dpdk/spdk_pid1657331 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1657566 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1657832 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1658115 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1658387 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1658654 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1658797 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1658975 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659002 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659036 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659057 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659091 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659120 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659157 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659176 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659218 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659243 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659279 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659298 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659336 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659364 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659398 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659419 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659454 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659481 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659515 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659536 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659580 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659603 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659646 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659668 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659713 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659732 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659772 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659798 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659835 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659902 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1659994 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660260 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660304 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660353 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660394 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660442 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660541 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1660659 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661123 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661183 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661230 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661273 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661316 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661369 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661412 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661461 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661499 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661542 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661591 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661643 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661694 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661736 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661778 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661826 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661865 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1661919 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662051 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662098 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662151 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662193 00:09:13.028 
Removing: /var/run/dpdk/spdk_pid1662248 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662300 00:09:13.028 Removing: /var/run/dpdk/spdk_pid1662344 00:09:13.028 Clean 00:09:13.028 killing process with pid 1594072 00:09:17.225 killing process with pid 1594069 00:09:17.225 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1594071) - No such process 00:09:17.225 Process with pid 1594071 is not found 00:09:17.225 killing process with pid 1594070 00:09:17.225 07:34:27 -- common/autotest_common.sh@1446 -- # return 0 00:09:17.225 07:34:27 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:17.225 07:34:27 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:17.225 07:34:27 -- common/autotest_common.sh@10 -- # set +x 00:09:17.225 07:34:27 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:17.225 07:34:27 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:17.225 07:34:27 -- common/autotest_common.sh@10 -- # set +x 00:09:17.225 07:34:27 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:17.225 07:34:27 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:17.225 07:34:27 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:17.225 07:34:27 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:17.225 07:34:27 -- spdk/autotest.sh@383 -- # hostname 00:09:17.225 07:34:27 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:17.225 geninfo: WARNING: invalid characters removed from testname! 
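The lcov invocation above (autotest.sh@383) captures the counters the fuzz runs left behind into cov_test.info, routing geninfo through the repo's test/fuzz/llvm/llvm-gcov.sh wrapper so clang-produced .gcda files can be read; the steps that follow in this log merge that capture with the pre-test baseline and strip out-of-tree paths. A condensed sketch of the whole sequence, with the --rc lcov/genhtml options elided for brevity and every remaining flag taken from the commands recorded in this log:

# Hedged sketch of this log's coverage post-processing (--rc knobs elided).
SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
OUT=$SPDK_DIR/../output
GCOV=$SPDK_DIR/test/fuzz/llvm/llvm-gcov.sh
# capture counters from the test run (-c), staying inside the tree (--no-external)
lcov --gcov-tool "$GCOV" -q -c --no-external -d "$SPDK_DIR" -t spdk-wfp-20 -o "$OUT/cov_test.info"
# merge (-a) the pre-test baseline with the post-test capture
lcov --gcov-tool "$GCOV" -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"
# remove (-r) vendored and system paths from the combined report
lcov --gcov-tool "$GCOV" -q -r "$OUT/cov_total.info" '*/dpdk/*' -o "$OUT/cov_total.info"
lcov --gcov-tool "$GCOV" -q -r "$OUT/cov_total.info" --ignore-errors unused,unused '/usr/*' -o "$OUT/cov_total.info"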
00:09:21.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:21.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:21.422 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:27.991 07:34:38 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:34.562 07:34:45 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:39.836 07:34:49 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:44.071 07:34:54 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:49.346 07:34:59 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:53.539 07:35:04 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
-q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:58.816 07:35:08 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:58.816 07:35:08 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:58.816 07:35:08 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:58.816 07:35:08 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:58.816 07:35:08 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:58.816 07:35:08 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:58.816 07:35:08 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:58.816 07:35:08 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:58.816 07:35:08 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:58.816 07:35:08 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:58.816 07:35:08 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:58.816 07:35:08 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:58.816 07:35:08 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:58.816 07:35:08 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:58.816 07:35:08 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:58.816 07:35:08 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:58.816 07:35:08 -- scripts/common.sh@343 -- $ case "$op" in 00:09:58.816 07:35:08 -- scripts/common.sh@344 -- $ : 1 00:09:58.816 07:35:08 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:58.816 07:35:08 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:58.816 07:35:08 -- scripts/common.sh@364 -- $ decimal 1 00:09:58.816 07:35:08 -- scripts/common.sh@352 -- $ local d=1 00:09:58.816 07:35:08 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:58.816 07:35:08 -- scripts/common.sh@354 -- $ echo 1 00:09:58.816 07:35:08 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:58.816 07:35:08 -- scripts/common.sh@365 -- $ decimal 2 00:09:58.816 07:35:08 -- scripts/common.sh@352 -- $ local d=2 00:09:58.816 07:35:08 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:58.816 07:35:08 -- scripts/common.sh@354 -- $ echo 2 00:09:58.816 07:35:08 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:58.816 07:35:08 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:58.816 07:35:08 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:58.816 07:35:08 -- scripts/common.sh@367 -- $ return 0 00:09:58.816 07:35:08 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:58.816 07:35:08 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:58.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:58.816 --rc genhtml_branch_coverage=1 00:09:58.816 --rc genhtml_function_coverage=1 00:09:58.816 --rc genhtml_legend=1 00:09:58.816 --rc geninfo_all_blocks=1 00:09:58.816 --rc geninfo_unexecuted_blocks=1 00:09:58.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:58.816 ' 00:09:58.816 07:35:08 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:58.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:58.816 --rc genhtml_branch_coverage=1 00:09:58.816 --rc genhtml_function_coverage=1 00:09:58.816 --rc genhtml_legend=1 00:09:58.816 --rc geninfo_all_blocks=1 00:09:58.816 --rc geninfo_unexecuted_blocks=1 00:09:58.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:58.816 ' 00:09:58.816 
07:35:08 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:58.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:58.816 --rc genhtml_branch_coverage=1 00:09:58.816 --rc genhtml_function_coverage=1 00:09:58.816 --rc genhtml_legend=1 00:09:58.816 --rc geninfo_all_blocks=1 00:09:58.816 --rc geninfo_unexecuted_blocks=1 00:09:58.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:58.816 ' 00:09:58.816 07:35:08 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:58.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:58.816 --rc genhtml_branch_coverage=1 00:09:58.816 --rc genhtml_function_coverage=1 00:09:58.816 --rc genhtml_legend=1 00:09:58.816 --rc geninfo_all_blocks=1 00:09:58.816 --rc geninfo_unexecuted_blocks=1 00:09:58.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:58.817 ' 00:09:58.817 07:35:08 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:58.817 07:35:08 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:58.817 07:35:08 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:58.817 07:35:08 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:58.817 07:35:08 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:58.817 07:35:08 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:58.817 07:35:08 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:58.817 07:35:08 -- paths/export.sh@5 -- $ export PATH 00:09:58.817 07:35:08 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:58.817 07:35:08 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:58.817 07:35:08 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:58.817 07:35:08 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732775708.XXXXXX 00:09:58.817 07:35:08 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732775708.uIEMjw 00:09:58.817 07:35:08 -- common/autobuild_common.sh@442 -- 
$ [[ -n '' ]] 00:09:58.817 07:35:08 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:09:58.817 07:35:08 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:58.817 07:35:08 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:09:58.817 07:35:08 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:58.817 07:35:08 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:58.817 07:35:08 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:58.817 07:35:08 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:58.817 07:35:08 -- common/autotest_common.sh@10 -- $ set +x 00:09:58.817 07:35:08 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:09:58.817 07:35:08 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:58.817 07:35:08 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:58.817 07:35:08 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:58.817 07:35:08 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:58.817 07:35:08 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:58.817 07:35:08 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:58.817 07:35:08 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:58.817 07:35:08 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:58.817 07:35:08 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:58.817 07:35:08 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:58.817 + [[ -n 1538193 ]] 00:09:58.817 + sudo kill 1538193 00:09:58.827 [Pipeline] } 00:09:58.844 [Pipeline] // stage 00:09:58.851 [Pipeline] } 00:09:58.869 [Pipeline] // timeout 00:09:58.875 [Pipeline] } 00:09:58.893 [Pipeline] // catchError 00:09:58.899 [Pipeline] } 00:09:58.915 [Pipeline] // wrap 00:09:58.921 [Pipeline] } 00:09:58.934 [Pipeline] // catchError 00:09:58.943 [Pipeline] stage 00:09:58.946 [Pipeline] { (Epilogue) 00:09:58.959 [Pipeline] catchError 00:09:58.960 [Pipeline] { 00:09:58.974 [Pipeline] echo 00:09:58.976 Cleanup processes 00:09:58.982 [Pipeline] sh 00:09:59.269 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:59.269 1668359 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:59.284 [Pipeline] sh 00:09:59.570 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:59.570 ++ grep -v 'sudo pgrep' 00:09:59.570 ++ awk '{print $1}' 00:09:59.570 + sudo kill -9 00:09:59.570 + true 00:09:59.583 [Pipeline] sh 00:09:59.869 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:59.869 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 
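The repeated xz notices here are informational, not errors: the compression helper asked for one worker per hardware thread (112, matching this job's -j112 MAKEFLAGS), and xz scaled the count down to 89 so the combined encoder memory stayed under the 14,718 MiB cap instead of failing. A hedged sketch of an equivalent invocation; the archive name is hypothetical, while -T0 (one thread per core) and the explicit compressor memory limit are standard xz options:

# Hedged sketch: multithreaded xz under a memory cap; xz trims threads as above.
# artifacts.tar is a hypothetical input, not a file named in this log.
xz -T0 --memlimit-compress=14718MiB artifacts.tar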
00:09:59.869 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:00.805 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:10.801 [Pipeline] sh 00:10:11.189 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:11.189 Artifacts sizes are good 00:10:11.204 [Pipeline] archiveArtifacts 00:10:11.212 Archiving artifacts 00:10:11.336 [Pipeline] sh 00:10:11.621 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:11.636 [Pipeline] cleanWs 00:10:11.646 [WS-CLEANUP] Deleting project workspace... 00:10:11.646 [WS-CLEANUP] Deferred wipeout is used... 00:10:11.653 [WS-CLEANUP] done 00:10:11.655 [Pipeline] } 00:10:11.672 [Pipeline] // catchError 00:10:11.685 [Pipeline] sh 00:10:11.968 + logger -p user.info -t JENKINS-CI 00:10:11.977 [Pipeline] } 00:10:11.991 [Pipeline] // stage 00:10:11.996 [Pipeline] } 00:10:12.010 [Pipeline] // node 00:10:12.016 [Pipeline] End of Pipeline 00:10:12.054 Finished: SUCCESS