00:00:00.001 Started by upstream project "autotest-spdk-v24.09-vs-dpdk-v23.11" build number 170 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3671 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.120 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.126 The recommended git tool is: git 00:00:00.126 using credential 00000000-0000-0000-0000-000000000002 00:00:00.129 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.142 Fetching changes from the remote Git repository 00:00:00.144 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.164 Using shallow fetch with depth 1 00:00:00.164 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.164 > git --version # timeout=10 00:00:00.185 > git --version # 'git version 2.39.2' 00:00:00.185 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.217 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.217 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:05.737 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.749 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.759 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:05.759 > git config core.sparsecheckout # timeout=10 00:00:05.770 > git read-tree -mu HEAD # timeout=10 00:00:05.786 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:05.807 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:05.807 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:05.904 [Pipeline] Start of Pipeline 00:00:05.916 [Pipeline] library 00:00:05.919 Loading library shm_lib@master 00:00:05.919 Library shm_lib@master is cached. Copying from home. 00:00:05.934 [Pipeline] node 00:00:05.951 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:05.953 [Pipeline] { 00:00:05.961 [Pipeline] catchError 00:00:05.962 [Pipeline] { 00:00:05.973 [Pipeline] wrap 00:00:05.982 [Pipeline] { 00:00:05.990 [Pipeline] stage 00:00:05.992 [Pipeline] { (Prologue) 00:00:06.193 [Pipeline] sh 00:00:06.478 + logger -p user.info -t JENKINS-CI 00:00:06.494 [Pipeline] echo 00:00:06.495 Node: WFP20 00:00:06.502 [Pipeline] sh 00:00:06.827 [Pipeline] setCustomBuildProperty 00:00:06.835 [Pipeline] echo 00:00:06.836 Cleanup processes 00:00:06.839 [Pipeline] sh 00:00:07.119 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.119 1579766 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.133 [Pipeline] sh 00:00:07.416 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.416 ++ grep -v 'sudo pgrep' 00:00:07.416 ++ awk '{print $1}' 00:00:07.416 + sudo kill -9 00:00:07.416 + true 00:00:07.428 [Pipeline] cleanWs 00:00:07.437 [WS-CLEANUP] Deleting project workspace... 00:00:07.437 [WS-CLEANUP] Deferred wipeout is used... 
00:00:07.443 [WS-CLEANUP] done 00:00:07.447 [Pipeline] setCustomBuildProperty 00:00:07.462 [Pipeline] sh 00:00:07.750 + sudo git config --global --replace-all safe.directory '*' 00:00:07.849 [Pipeline] httpRequest 00:00:08.304 [Pipeline] echo 00:00:08.306 Sorcerer 10.211.164.101 is alive 00:00:08.314 [Pipeline] retry 00:00:08.315 [Pipeline] { 00:00:08.329 [Pipeline] httpRequest 00:00:08.333 HttpMethod: GET 00:00:08.333 URL: http://10.211.164.101/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.334 Sending request to url: http://10.211.164.101/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.336 Response Code: HTTP/1.1 200 OK 00:00:08.336 Success: Status code 200 is in the accepted range: 200,404 00:00:08.336 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.116 [Pipeline] } 00:00:09.132 [Pipeline] // retry 00:00:09.139 [Pipeline] sh 00:00:09.421 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.434 [Pipeline] httpRequest 00:00:11.122 [Pipeline] echo 00:00:11.124 Sorcerer 10.211.164.101 is alive 00:00:11.136 [Pipeline] retry 00:00:11.138 [Pipeline] { 00:00:11.152 [Pipeline] httpRequest 00:00:11.157 HttpMethod: GET 00:00:11.157 URL: http://10.211.164.101/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:11.158 Sending request to url: http://10.211.164.101/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:11.179 Response Code: HTTP/1.1 200 OK 00:00:11.180 Success: Status code 200 is in the accepted range: 200,404 00:00:11.180 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:13.225 [Pipeline] } 00:01:13.241 [Pipeline] // retry 00:01:13.249 [Pipeline] sh 00:01:13.526 + tar --no-same-owner -xf spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:16.066 [Pipeline] sh 00:01:16.345 + git -C spdk log --oneline -n5 00:01:16.345 b18e1bd62 version: v24.09.1-pre 00:01:16.345 19524ad45 version: v24.09 00:01:16.345 9756b40a3 dpdk: update submodule to include alarm_cancel fix 00:01:16.345 a808500d2 test/nvmf: disable nvmf_shutdown_tc4 on e810 00:01:16.345 3024272c6 bdev/nvme: take nvme_ctrlr.mutex when setting keys 00:01:16.364 [Pipeline] withCredentials 00:01:16.375 > git --version # timeout=10 00:01:16.391 > git --version # 'git version 2.39.2' 00:01:16.407 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:16.409 [Pipeline] { 00:01:16.418 [Pipeline] retry 00:01:16.420 [Pipeline] { 00:01:16.436 [Pipeline] sh 00:01:16.717 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:01:16.730 [Pipeline] } 00:01:16.748 [Pipeline] // retry 00:01:16.753 [Pipeline] } 00:01:16.770 [Pipeline] // withCredentials 00:01:16.780 [Pipeline] httpRequest 00:01:17.205 [Pipeline] echo 00:01:17.207 Sorcerer 10.211.164.101 is alive 00:01:17.217 [Pipeline] retry 00:01:17.220 [Pipeline] { 00:01:17.234 [Pipeline] httpRequest 00:01:17.239 HttpMethod: GET 00:01:17.239 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:17.240 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:17.244 Response Code: HTTP/1.1 200 OK 00:01:17.245 Success: Status code 200 is in the accepted range: 200,404 00:01:17.245 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 
00:02:15.004 [Pipeline] } 00:02:15.029 [Pipeline] // retry 00:02:15.039 [Pipeline] sh 00:02:15.326 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:02:16.715 [Pipeline] sh 00:02:16.998 + git -C dpdk log --oneline -n5 00:02:16.998 eeb0605f11 version: 23.11.0 00:02:16.998 238778122a doc: update release notes for 23.11 00:02:16.998 46aa6b3cfc doc: fix description of RSS features 00:02:16.998 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:02:16.998 7e421ae345 devtools: support skipping forbid rule check 00:02:17.008 [Pipeline] } 00:02:17.027 [Pipeline] // stage 00:02:17.039 [Pipeline] stage 00:02:17.042 [Pipeline] { (Prepare) 00:02:17.064 [Pipeline] writeFile 00:02:17.082 [Pipeline] sh 00:02:17.370 + logger -p user.info -t JENKINS-CI 00:02:17.384 [Pipeline] sh 00:02:17.673 + logger -p user.info -t JENKINS-CI 00:02:17.686 [Pipeline] sh 00:02:17.970 + cat autorun-spdk.conf 00:02:17.970 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:17.970 SPDK_TEST_FUZZER_SHORT=1 00:02:17.970 SPDK_TEST_FUZZER=1 00:02:17.970 SPDK_TEST_SETUP=1 00:02:17.970 SPDK_RUN_UBSAN=1 00:02:17.970 SPDK_TEST_NATIVE_DPDK=v23.11 00:02:17.970 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:17.977 RUN_NIGHTLY=1 00:02:17.984 [Pipeline] readFile 00:02:18.017 [Pipeline] withEnv 00:02:18.020 [Pipeline] { 00:02:18.035 [Pipeline] sh 00:02:18.322 + set -ex 00:02:18.322 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:02:18.322 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:18.322 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:18.322 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:18.322 ++ SPDK_TEST_FUZZER=1 00:02:18.322 ++ SPDK_TEST_SETUP=1 00:02:18.322 ++ SPDK_RUN_UBSAN=1 00:02:18.322 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:18.322 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:18.322 ++ RUN_NIGHTLY=1 00:02:18.322 + case $SPDK_TEST_NVMF_NICS in 00:02:18.322 + DRIVERS= 00:02:18.322 + [[ -n '' ]] 00:02:18.322 + exit 0 00:02:18.332 [Pipeline] } 00:02:18.350 [Pipeline] // withEnv 00:02:18.355 [Pipeline] } 00:02:18.372 [Pipeline] // stage 00:02:18.383 [Pipeline] catchError 00:02:18.385 [Pipeline] { 00:02:18.401 [Pipeline] timeout 00:02:18.401 Timeout set to expire in 30 min 00:02:18.404 [Pipeline] { 00:02:18.420 [Pipeline] stage 00:02:18.423 [Pipeline] { (Tests) 00:02:18.440 [Pipeline] sh 00:02:18.731 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:18.731 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:18.731 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:02:18.731 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:02:18.731 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:18.731 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:18.731 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:02:18.731 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:18.731 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:18.731 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:18.731 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:02:18.731 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:18.732 + source /etc/os-release 00:02:18.732 ++ NAME='Fedora Linux' 00:02:18.732 ++ VERSION='39 (Cloud Edition)' 00:02:18.732 ++ ID=fedora 00:02:18.732 ++ VERSION_ID=39 00:02:18.732 ++ VERSION_CODENAME= 00:02:18.732 ++ PLATFORM_ID=platform:f39 00:02:18.732 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:18.732 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:18.732 ++ LOGO=fedora-logo-icon 00:02:18.732 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:18.732 ++ HOME_URL=https://fedoraproject.org/ 00:02:18.732 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:18.732 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:18.732 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:18.732 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:18.732 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:18.732 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:18.732 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:18.732 ++ SUPPORT_END=2024-11-12 00:02:18.732 ++ VARIANT='Cloud Edition' 00:02:18.732 ++ VARIANT_ID=cloud 00:02:18.732 + uname -a 00:02:18.732 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:18.732 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:21.269 Hugepages 00:02:21.269 node hugesize free / total 00:02:21.269 node0 1048576kB 0 / 0 00:02:21.269 node0 2048kB 0 / 0 00:02:21.269 node1 1048576kB 0 / 0 00:02:21.269 node1 2048kB 0 / 0 00:02:21.269 00:02:21.269 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:21.269 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:21.269 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:21.529 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:21.529 + rm -f /tmp/spdk-ld-path 00:02:21.529 + source autorun-spdk.conf 00:02:21.529 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:21.529 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:21.529 ++ SPDK_TEST_FUZZER=1 00:02:21.529 ++ SPDK_TEST_SETUP=1 00:02:21.529 ++ SPDK_RUN_UBSAN=1 00:02:21.529 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:21.529 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:21.529 ++ RUN_NIGHTLY=1 00:02:21.529 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:21.529 + [[ -n '' ]] 00:02:21.529 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:21.529 + for M in 
/var/spdk/build-*-manifest.txt 00:02:21.529 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:21.529 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:21.529 + for M in /var/spdk/build-*-manifest.txt 00:02:21.529 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:21.529 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:21.529 + for M in /var/spdk/build-*-manifest.txt 00:02:21.529 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:21.529 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:21.529 ++ uname 00:02:21.529 + [[ Linux == \L\i\n\u\x ]] 00:02:21.529 + sudo dmesg -T 00:02:21.529 + sudo dmesg --clear 00:02:21.529 + dmesg_pid=1581261 00:02:21.529 + [[ Fedora Linux == FreeBSD ]] 00:02:21.529 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:21.529 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:21.529 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:21.529 + [[ -x /usr/src/fio-static/fio ]] 00:02:21.529 + export FIO_BIN=/usr/src/fio-static/fio 00:02:21.529 + FIO_BIN=/usr/src/fio-static/fio 00:02:21.529 + sudo dmesg -Tw 00:02:21.529 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:21.529 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:21.529 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:21.529 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:21.529 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:21.529 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:21.529 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:21.529 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:21.529 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:21.529 Test configuration: 00:02:21.529 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:21.529 SPDK_TEST_FUZZER_SHORT=1 00:02:21.529 SPDK_TEST_FUZZER=1 00:02:21.529 SPDK_TEST_SETUP=1 00:02:21.529 SPDK_RUN_UBSAN=1 00:02:21.529 SPDK_TEST_NATIVE_DPDK=v23.11 00:02:21.529 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:21.788 RUN_NIGHTLY=1 11:57:50 -- common/autotest_common.sh@1680 -- $ [[ n == y ]] 00:02:21.788 11:57:50 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:21.788 11:57:50 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:21.788 11:57:50 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:21.788 11:57:50 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:21.789 11:57:50 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:21.789 11:57:50 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:21.789 11:57:50 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:21.789 11:57:50 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:21.789 11:57:50 -- paths/export.sh@5 -- $ export PATH 00:02:21.789 11:57:50 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:21.789 11:57:50 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:21.789 11:57:50 -- common/autobuild_common.sh@479 -- $ date +%s 00:02:21.789 11:57:50 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1732705070.XXXXXX 00:02:21.789 11:57:50 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1732705070.5xuvVa 00:02:21.789 11:57:50 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:02:21.789 11:57:50 -- common/autobuild_common.sh@485 -- $ '[' -n v23.11 ']' 00:02:21.789 11:57:50 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:21.789 11:57:50 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:02:21.789 11:57:50 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:21.789 11:57:50 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:21.789 11:57:50 -- common/autobuild_common.sh@495 -- $ get_config_params 00:02:21.789 11:57:50 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:02:21.789 11:57:50 -- common/autotest_common.sh@10 -- $ set +x 00:02:21.789 11:57:50 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:02:21.789 11:57:50 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:02:21.789 11:57:50 -- pm/common@17 -- $ local monitor 00:02:21.789 11:57:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:21.789 11:57:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:21.789 11:57:50 -- pm/common@19 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:02:21.789 11:57:50 -- pm/common@21 -- $ date +%s 00:02:21.789 11:57:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:21.789 11:57:50 -- pm/common@21 -- $ date +%s 00:02:21.789 11:57:50 -- pm/common@25 -- $ sleep 1 00:02:21.789 11:57:50 -- pm/common@21 -- $ date +%s 00:02:21.789 11:57:50 -- pm/common@21 -- $ date +%s 00:02:21.789 11:57:50 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732705070 00:02:21.789 11:57:50 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732705070 00:02:21.789 11:57:50 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732705070 00:02:21.789 11:57:50 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732705070 00:02:21.789 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732705070_collect-cpu-load.pm.log 00:02:21.789 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732705070_collect-vmstat.pm.log 00:02:21.789 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732705070_collect-cpu-temp.pm.log 00:02:21.789 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732705070_collect-bmc-pm.bmc.pm.log 00:02:22.727 11:57:51 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:02:22.727 11:57:51 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:22.727 11:57:51 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:22.727 11:57:51 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:22.727 11:57:51 -- spdk/autobuild.sh@16 -- $ date -u 00:02:22.727 Wed Nov 27 10:57:51 AM UTC 2024 00:02:22.727 11:57:51 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:22.727 v24.09-rc1-9-gb18e1bd62 00:02:22.727 11:57:51 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:22.727 11:57:51 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:22.727 11:57:51 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:22.727 11:57:51 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:02:22.727 11:57:51 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:22.727 11:57:51 -- common/autotest_common.sh@10 -- $ set +x 00:02:22.727 ************************************ 00:02:22.727 START TEST ubsan 00:02:22.727 ************************************ 00:02:22.727 11:57:51 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:02:22.727 using ubsan 00:02:22.727 00:02:22.727 real 0m0.001s 00:02:22.727 user 0m0.000s 00:02:22.727 sys 0m0.000s 00:02:22.727 11:57:51 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:22.727 11:57:51 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:22.727 ************************************ 00:02:22.727 END TEST ubsan 00:02:22.727 ************************************ 00:02:22.987 11:57:51 -- 
spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:02:22.987 11:57:51 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:22.987 11:57:51 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:22.987 11:57:51 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:22.987 11:57:51 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:22.987 11:57:51 -- common/autotest_common.sh@10 -- $ set +x 00:02:22.987 ************************************ 00:02:22.987 START TEST build_native_dpdk 00:02:22.987 ************************************ 00:02:22.987 11:57:51 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:22.987 11:57:51 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:02:22.988 eeb0605f11 version: 23.11.0 00:02:22.988 238778122a doc: update release notes for 23.11 00:02:22.988 46aa6b3cfc doc: fix description of RSS features 00:02:22.988 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:02:22.988 7e421ae345 devtools: support skipping forbid rule check 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:22.988 
11:57:51 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:02:22.988 patching file config/rte_config.h 00:02:22.988 Hunk #1 succeeded at 60 (offset 1 line). 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:02:22.988 patching file lib/pcapng/rte_pcapng.c 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 23.11.0 24.07.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:22.988 11:57:51 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:02:22.988 11:57:51 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:28.269 The Meson build system 00:02:28.269 Version: 1.5.0 00:02:28.269 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:28.269 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:28.269 Build type: native build 00:02:28.269 Program cat found: YES (/usr/bin/cat) 00:02:28.269 Project name: DPDK 00:02:28.269 Project version: 23.11.0 00:02:28.269 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:28.269 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:28.269 Host machine cpu family: x86_64 00:02:28.269 Host machine cpu: x86_64 00:02:28.269 Message: ## Building in Developer Mode ## 00:02:28.269 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:28.269 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:28.269 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:28.269 Program python3 found: YES (/usr/bin/python3) 00:02:28.269 Program cat found: YES (/usr/bin/cat) 00:02:28.269 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:02:28.269 Compiler for C supports arguments -march=native: YES 00:02:28.269 Checking for size of "void *" : 8 00:02:28.269 Checking for size of "void *" : 8 (cached) 00:02:28.269 Library m found: YES 00:02:28.269 Library numa found: YES 00:02:28.269 Has header "numaif.h" : YES 00:02:28.269 Library fdt found: NO 00:02:28.269 Library execinfo found: NO 00:02:28.269 Has header "execinfo.h" : YES 00:02:28.269 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:28.269 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:28.269 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:28.269 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:28.269 Run-time dependency openssl found: YES 3.1.1 00:02:28.269 Run-time dependency libpcap found: YES 1.10.4 00:02:28.269 Has header "pcap.h" with dependency libpcap: YES 00:02:28.269 Compiler for C supports arguments -Wcast-qual: YES 00:02:28.269 Compiler for C supports arguments -Wdeprecated: YES 00:02:28.269 Compiler for C supports arguments -Wformat: YES 00:02:28.269 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:28.269 Compiler for C supports arguments -Wformat-security: NO 00:02:28.269 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:28.269 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:28.269 Compiler for C supports arguments -Wnested-externs: YES 00:02:28.269 Compiler for C supports arguments -Wold-style-definition: YES 00:02:28.269 Compiler for C supports arguments -Wpointer-arith: YES 00:02:28.269 Compiler for C supports arguments -Wsign-compare: YES 00:02:28.269 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:28.269 Compiler for C supports arguments -Wundef: YES 00:02:28.269 Compiler for C supports arguments -Wwrite-strings: YES 00:02:28.269 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:28.269 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:28.269 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:28.269 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:28.269 Program objdump found: YES (/usr/bin/objdump) 00:02:28.269 Compiler for C supports arguments -mavx512f: YES 00:02:28.269 Checking if "AVX512 checking" compiles: YES 00:02:28.269 Fetching value of define "__SSE4_2__" : 1 00:02:28.269 Fetching value of define "__AES__" : 1 00:02:28.269 Fetching value of define "__AVX__" : 1 00:02:28.269 Fetching value of define "__AVX2__" : 1 00:02:28.269 Fetching value of define "__AVX512BW__" : 1 00:02:28.269 Fetching value of define "__AVX512CD__" : 1 00:02:28.269 Fetching value of define "__AVX512DQ__" : 1 00:02:28.269 Fetching value of define "__AVX512F__" : 1 00:02:28.269 Fetching value of define "__AVX512VL__" : 1 00:02:28.269 Fetching value of define "__PCLMUL__" : 1 00:02:28.269 Fetching value of define "__RDRND__" : 1 00:02:28.269 Fetching value of define "__RDSEED__" : 1 00:02:28.269 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:28.269 Fetching value of define "__znver1__" : (undefined) 00:02:28.269 Fetching value of define "__znver2__" : (undefined) 00:02:28.269 Fetching value of define "__znver3__" : (undefined) 00:02:28.269 Fetching value of define "__znver4__" : (undefined) 00:02:28.269 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:28.269 Message: lib/log: Defining dependency "log" 00:02:28.269 Message: lib/kvargs: Defining dependency "kvargs" 00:02:28.269 Message: lib/telemetry: Defining dependency 
"telemetry" 00:02:28.269 Checking for function "getentropy" : NO 00:02:28.269 Message: lib/eal: Defining dependency "eal" 00:02:28.269 Message: lib/ring: Defining dependency "ring" 00:02:28.269 Message: lib/rcu: Defining dependency "rcu" 00:02:28.269 Message: lib/mempool: Defining dependency "mempool" 00:02:28.269 Message: lib/mbuf: Defining dependency "mbuf" 00:02:28.269 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:28.269 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:28.269 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:28.269 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:28.269 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:28.269 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:28.269 Compiler for C supports arguments -mpclmul: YES 00:02:28.269 Compiler for C supports arguments -maes: YES 00:02:28.269 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:28.269 Compiler for C supports arguments -mavx512bw: YES 00:02:28.269 Compiler for C supports arguments -mavx512dq: YES 00:02:28.269 Compiler for C supports arguments -mavx512vl: YES 00:02:28.269 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:28.269 Compiler for C supports arguments -mavx2: YES 00:02:28.269 Compiler for C supports arguments -mavx: YES 00:02:28.269 Message: lib/net: Defining dependency "net" 00:02:28.269 Message: lib/meter: Defining dependency "meter" 00:02:28.269 Message: lib/ethdev: Defining dependency "ethdev" 00:02:28.269 Message: lib/pci: Defining dependency "pci" 00:02:28.269 Message: lib/cmdline: Defining dependency "cmdline" 00:02:28.269 Message: lib/metrics: Defining dependency "metrics" 00:02:28.269 Message: lib/hash: Defining dependency "hash" 00:02:28.269 Message: lib/timer: Defining dependency "timer" 00:02:28.269 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:28.269 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:28.270 Message: lib/acl: Defining dependency "acl" 00:02:28.270 Message: lib/bbdev: Defining dependency "bbdev" 00:02:28.270 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:28.270 Run-time dependency libelf found: YES 0.191 00:02:28.270 Message: lib/bpf: Defining dependency "bpf" 00:02:28.270 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:28.270 Message: lib/compressdev: Defining dependency "compressdev" 00:02:28.270 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:28.270 Message: lib/distributor: Defining dependency "distributor" 00:02:28.270 Message: lib/dmadev: Defining dependency "dmadev" 00:02:28.270 Message: lib/efd: Defining dependency "efd" 00:02:28.270 Message: lib/eventdev: Defining dependency "eventdev" 00:02:28.270 Message: lib/dispatcher: Defining dependency "dispatcher" 00:02:28.270 Message: lib/gpudev: Defining dependency "gpudev" 00:02:28.270 Message: lib/gro: Defining dependency "gro" 00:02:28.270 Message: lib/gso: Defining dependency "gso" 00:02:28.270 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:28.270 Message: lib/jobstats: Defining dependency "jobstats" 00:02:28.270 Message: lib/latencystats: Defining dependency "latencystats" 00:02:28.270 Message: lib/lpm: Defining dependency "lpm" 00:02:28.270 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:02:28.270 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:28.270 Message: lib/member: Defining dependency "member" 00:02:28.270 Message: lib/pcapng: Defining dependency "pcapng" 00:02:28.270 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:28.270 Message: lib/power: Defining dependency "power" 00:02:28.270 Message: lib/rawdev: Defining dependency "rawdev" 00:02:28.270 Message: lib/regexdev: Defining dependency "regexdev" 00:02:28.270 Message: lib/mldev: Defining dependency "mldev" 00:02:28.270 Message: lib/rib: Defining dependency "rib" 00:02:28.270 Message: lib/reorder: Defining dependency "reorder" 00:02:28.270 Message: lib/sched: Defining dependency "sched" 00:02:28.270 Message: lib/security: Defining dependency "security" 00:02:28.270 Message: lib/stack: Defining dependency "stack" 00:02:28.270 Has header "linux/userfaultfd.h" : YES 00:02:28.270 Has header "linux/vduse.h" : YES 00:02:28.270 Message: lib/vhost: Defining dependency "vhost" 00:02:28.270 Message: lib/ipsec: Defining dependency "ipsec" 00:02:28.270 Message: lib/pdcp: Defining dependency "pdcp" 00:02:28.270 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:28.270 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:28.270 Message: lib/fib: Defining dependency "fib" 00:02:28.270 Message: lib/port: Defining dependency "port" 00:02:28.270 Message: lib/pdump: Defining dependency "pdump" 00:02:28.270 Message: lib/table: Defining dependency "table" 00:02:28.270 Message: lib/pipeline: Defining dependency "pipeline" 00:02:28.270 Message: lib/graph: Defining dependency "graph" 00:02:28.270 Message: lib/node: Defining dependency "node" 00:02:28.270 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:29.213 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:29.213 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:29.213 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:29.213 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:29.213 Compiler for C supports arguments -Wno-unused-value: YES 00:02:29.213 Compiler for C supports arguments -Wno-format: YES 00:02:29.213 Compiler for C supports arguments -Wno-format-security: YES 00:02:29.213 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:29.213 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:29.213 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:29.213 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:29.213 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:29.213 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:29.213 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:29.213 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:29.213 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:29.213 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:29.213 Has header "sys/epoll.h" : YES 00:02:29.213 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:29.213 Configuring doxy-api-html.conf using configuration 00:02:29.213 Configuring doxy-api-man.conf using configuration 00:02:29.213 Program mandb found: YES (/usr/bin/mandb) 00:02:29.213 Program sphinx-build found: NO 00:02:29.213 Configuring rte_build_config.h using configuration 00:02:29.213 Message: 00:02:29.213 ================= 00:02:29.213 Applications Enabled 
00:02:29.213 ================= 00:02:29.213 00:02:29.213 apps: 00:02:29.213 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:02:29.213 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:02:29.213 test-pmd, test-regex, test-sad, test-security-perf, 00:02:29.213 00:02:29.213 Message: 00:02:29.213 ================= 00:02:29.213 Libraries Enabled 00:02:29.213 ================= 00:02:29.213 00:02:29.213 libs: 00:02:29.213 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:29.213 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:02:29.213 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:02:29.213 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:02:29.213 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:02:29.214 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:02:29.214 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:02:29.214 00:02:29.214 00:02:29.214 Message: 00:02:29.214 =============== 00:02:29.214 Drivers Enabled 00:02:29.214 =============== 00:02:29.214 00:02:29.214 common: 00:02:29.214 00:02:29.214 bus: 00:02:29.214 pci, vdev, 00:02:29.214 mempool: 00:02:29.214 ring, 00:02:29.214 dma: 00:02:29.214 00:02:29.214 net: 00:02:29.214 i40e, 00:02:29.214 raw: 00:02:29.214 00:02:29.214 crypto: 00:02:29.214 00:02:29.214 compress: 00:02:29.214 00:02:29.214 regex: 00:02:29.214 00:02:29.214 ml: 00:02:29.214 00:02:29.214 vdpa: 00:02:29.214 00:02:29.214 event: 00:02:29.214 00:02:29.214 baseband: 00:02:29.214 00:02:29.214 gpu: 00:02:29.214 00:02:29.214 00:02:29.214 Message: 00:02:29.214 ================= 00:02:29.214 Content Skipped 00:02:29.214 ================= 00:02:29.214 00:02:29.214 apps: 00:02:29.214 00:02:29.214 libs: 00:02:29.214 00:02:29.214 drivers: 00:02:29.214 common/cpt: not in enabled drivers build config 00:02:29.214 common/dpaax: not in enabled drivers build config 00:02:29.214 common/iavf: not in enabled drivers build config 00:02:29.214 common/idpf: not in enabled drivers build config 00:02:29.214 common/mvep: not in enabled drivers build config 00:02:29.214 common/octeontx: not in enabled drivers build config 00:02:29.214 bus/auxiliary: not in enabled drivers build config 00:02:29.214 bus/cdx: not in enabled drivers build config 00:02:29.214 bus/dpaa: not in enabled drivers build config 00:02:29.214 bus/fslmc: not in enabled drivers build config 00:02:29.214 bus/ifpga: not in enabled drivers build config 00:02:29.214 bus/platform: not in enabled drivers build config 00:02:29.214 bus/vmbus: not in enabled drivers build config 00:02:29.214 common/cnxk: not in enabled drivers build config 00:02:29.214 common/mlx5: not in enabled drivers build config 00:02:29.214 common/nfp: not in enabled drivers build config 00:02:29.214 common/qat: not in enabled drivers build config 00:02:29.214 common/sfc_efx: not in enabled drivers build config 00:02:29.214 mempool/bucket: not in enabled drivers build config 00:02:29.214 mempool/cnxk: not in enabled drivers build config 00:02:29.214 mempool/dpaa: not in enabled drivers build config 00:02:29.214 mempool/dpaa2: not in enabled drivers build config 00:02:29.214 mempool/octeontx: not in enabled drivers build config 00:02:29.214 mempool/stack: not in enabled drivers build config 00:02:29.214 dma/cnxk: not in enabled drivers build config 00:02:29.214 dma/dpaa: not in enabled drivers build config 00:02:29.214 dma/dpaa2: not in enabled 
drivers build config 00:02:29.214 dma/hisilicon: not in enabled drivers build config 00:02:29.214 dma/idxd: not in enabled drivers build config 00:02:29.214 dma/ioat: not in enabled drivers build config 00:02:29.214 dma/skeleton: not in enabled drivers build config 00:02:29.214 net/af_packet: not in enabled drivers build config 00:02:29.214 net/af_xdp: not in enabled drivers build config 00:02:29.214 net/ark: not in enabled drivers build config 00:02:29.214 net/atlantic: not in enabled drivers build config 00:02:29.214 net/avp: not in enabled drivers build config 00:02:29.214 net/axgbe: not in enabled drivers build config 00:02:29.214 net/bnx2x: not in enabled drivers build config 00:02:29.214 net/bnxt: not in enabled drivers build config 00:02:29.214 net/bonding: not in enabled drivers build config 00:02:29.214 net/cnxk: not in enabled drivers build config 00:02:29.214 net/cpfl: not in enabled drivers build config 00:02:29.214 net/cxgbe: not in enabled drivers build config 00:02:29.214 net/dpaa: not in enabled drivers build config 00:02:29.214 net/dpaa2: not in enabled drivers build config 00:02:29.214 net/e1000: not in enabled drivers build config 00:02:29.214 net/ena: not in enabled drivers build config 00:02:29.214 net/enetc: not in enabled drivers build config 00:02:29.214 net/enetfec: not in enabled drivers build config 00:02:29.214 net/enic: not in enabled drivers build config 00:02:29.214 net/failsafe: not in enabled drivers build config 00:02:29.214 net/fm10k: not in enabled drivers build config 00:02:29.214 net/gve: not in enabled drivers build config 00:02:29.214 net/hinic: not in enabled drivers build config 00:02:29.214 net/hns3: not in enabled drivers build config 00:02:29.214 net/iavf: not in enabled drivers build config 00:02:29.214 net/ice: not in enabled drivers build config 00:02:29.214 net/idpf: not in enabled drivers build config 00:02:29.214 net/igc: not in enabled drivers build config 00:02:29.214 net/ionic: not in enabled drivers build config 00:02:29.214 net/ipn3ke: not in enabled drivers build config 00:02:29.214 net/ixgbe: not in enabled drivers build config 00:02:29.214 net/mana: not in enabled drivers build config 00:02:29.214 net/memif: not in enabled drivers build config 00:02:29.214 net/mlx4: not in enabled drivers build config 00:02:29.214 net/mlx5: not in enabled drivers build config 00:02:29.214 net/mvneta: not in enabled drivers build config 00:02:29.214 net/mvpp2: not in enabled drivers build config 00:02:29.214 net/netvsc: not in enabled drivers build config 00:02:29.214 net/nfb: not in enabled drivers build config 00:02:29.214 net/nfp: not in enabled drivers build config 00:02:29.214 net/ngbe: not in enabled drivers build config 00:02:29.214 net/null: not in enabled drivers build config 00:02:29.214 net/octeontx: not in enabled drivers build config 00:02:29.214 net/octeon_ep: not in enabled drivers build config 00:02:29.214 net/pcap: not in enabled drivers build config 00:02:29.214 net/pfe: not in enabled drivers build config 00:02:29.214 net/qede: not in enabled drivers build config 00:02:29.214 net/ring: not in enabled drivers build config 00:02:29.214 net/sfc: not in enabled drivers build config 00:02:29.214 net/softnic: not in enabled drivers build config 00:02:29.214 net/tap: not in enabled drivers build config 00:02:29.214 net/thunderx: not in enabled drivers build config 00:02:29.214 net/txgbe: not in enabled drivers build config 00:02:29.214 net/vdev_netvsc: not in enabled drivers build config 00:02:29.214 net/vhost: not in enabled drivers 
build config 00:02:29.214 net/virtio: not in enabled drivers build config 00:02:29.214 net/vmxnet3: not in enabled drivers build config 00:02:29.214 raw/cnxk_bphy: not in enabled drivers build config 00:02:29.214 raw/cnxk_gpio: not in enabled drivers build config 00:02:29.214 raw/dpaa2_cmdif: not in enabled drivers build config 00:02:29.214 raw/ifpga: not in enabled drivers build config 00:02:29.214 raw/ntb: not in enabled drivers build config 00:02:29.214 raw/skeleton: not in enabled drivers build config 00:02:29.214 crypto/armv8: not in enabled drivers build config 00:02:29.214 crypto/bcmfs: not in enabled drivers build config 00:02:29.214 crypto/caam_jr: not in enabled drivers build config 00:02:29.214 crypto/ccp: not in enabled drivers build config 00:02:29.214 crypto/cnxk: not in enabled drivers build config 00:02:29.214 crypto/dpaa_sec: not in enabled drivers build config 00:02:29.214 crypto/dpaa2_sec: not in enabled drivers build config 00:02:29.214 crypto/ipsec_mb: not in enabled drivers build config 00:02:29.214 crypto/mlx5: not in enabled drivers build config 00:02:29.214 crypto/mvsam: not in enabled drivers build config 00:02:29.214 crypto/nitrox: not in enabled drivers build config 00:02:29.214 crypto/null: not in enabled drivers build config 00:02:29.214 crypto/octeontx: not in enabled drivers build config 00:02:29.214 crypto/openssl: not in enabled drivers build config 00:02:29.214 crypto/scheduler: not in enabled drivers build config 00:02:29.214 crypto/uadk: not in enabled drivers build config 00:02:29.214 crypto/virtio: not in enabled drivers build config 00:02:29.214 compress/isal: not in enabled drivers build config 00:02:29.214 compress/mlx5: not in enabled drivers build config 00:02:29.214 compress/octeontx: not in enabled drivers build config 00:02:29.214 compress/zlib: not in enabled drivers build config 00:02:29.214 regex/mlx5: not in enabled drivers build config 00:02:29.214 regex/cn9k: not in enabled drivers build config 00:02:29.214 ml/cnxk: not in enabled drivers build config 00:02:29.214 vdpa/ifc: not in enabled drivers build config 00:02:29.214 vdpa/mlx5: not in enabled drivers build config 00:02:29.214 vdpa/nfp: not in enabled drivers build config 00:02:29.214 vdpa/sfc: not in enabled drivers build config 00:02:29.214 event/cnxk: not in enabled drivers build config 00:02:29.214 event/dlb2: not in enabled drivers build config 00:02:29.214 event/dpaa: not in enabled drivers build config 00:02:29.214 event/dpaa2: not in enabled drivers build config 00:02:29.214 event/dsw: not in enabled drivers build config 00:02:29.214 event/opdl: not in enabled drivers build config 00:02:29.214 event/skeleton: not in enabled drivers build config 00:02:29.214 event/sw: not in enabled drivers build config 00:02:29.214 event/octeontx: not in enabled drivers build config 00:02:29.214 baseband/acc: not in enabled drivers build config 00:02:29.214 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:02:29.214 baseband/fpga_lte_fec: not in enabled drivers build config 00:02:29.214 baseband/la12xx: not in enabled drivers build config 00:02:29.214 baseband/null: not in enabled drivers build config 00:02:29.214 baseband/turbo_sw: not in enabled drivers build config 00:02:29.214 gpu/cuda: not in enabled drivers build config 00:02:29.214 00:02:29.214 00:02:29.214 Build targets in project: 217 00:02:29.214 00:02:29.214 DPDK 23.11.0 00:02:29.214 00:02:29.214 User defined options 00:02:29.214 libdir : lib 00:02:29.214 prefix : 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:29.214 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:02:29.214 c_link_args : 00:02:29.214 enable_docs : false 00:02:29.214 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:29.214 enable_kmods : false 00:02:29.214 machine : native 00:02:29.214 tests : false 00:02:29.214 00:02:29.215 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:29.215 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:02:29.526 11:57:58 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:29.526 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:29.526 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:29.526 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:29.526 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:29.526 [4/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:29.526 [5/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:29.526 [6/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:29.526 [7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:29.526 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:29.841 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:29.841 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:29.841 [11/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:29.841 [12/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:29.841 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:29.841 [14/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:29.841 [15/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:29.841 [16/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:29.841 [17/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:29.841 [18/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:29.841 [19/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:29.841 [20/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:29.841 [21/707] Linking static target lib/librte_kvargs.a 00:02:29.841 [22/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:29.841 [23/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:29.841 [24/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:29.841 [25/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:29.841 [26/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:29.841 [27/707] Linking static target lib/librte_pci.a 00:02:29.841 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:29.841 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:29.841 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:29.841 [31/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:29.841 
[32/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:29.841 [33/707] Linking static target lib/librte_log.a 00:02:29.841 [34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:29.841 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:29.841 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:30.133 [37/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.133 [38/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:30.133 [39/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:30.133 [40/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:30.133 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:30.133 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:30.133 [43/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:30.133 [44/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:30.133 [45/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:30.133 [46/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.133 [47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:30.133 [48/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:30.133 [49/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:30.133 [50/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:30.133 [51/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:30.133 [52/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:30.133 [53/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:30.397 [54/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:30.397 [55/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:30.397 [56/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:30.397 [57/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:30.397 [58/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:30.397 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:30.397 [60/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:30.397 [61/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:30.397 [62/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:30.397 [63/707] Linking static target lib/librte_meter.a 00:02:30.397 [64/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:30.397 [65/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:30.397 [66/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:30.397 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:30.397 [68/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:30.397 [69/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:30.397 [70/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:30.397 [71/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 
00:02:30.397 [72/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:30.397 [73/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:30.397 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:30.397 [75/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:30.397 [76/707] Linking static target lib/librte_cmdline.a 00:02:30.397 [77/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:30.397 [78/707] Linking static target lib/librte_ring.a 00:02:30.397 [79/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:30.397 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:30.397 [81/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:30.397 [82/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:30.397 [83/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:30.397 [84/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:30.397 [85/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:30.397 [86/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:30.397 [87/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:30.397 [88/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:30.397 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:30.397 [90/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:30.397 [91/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:30.397 [92/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:30.397 [93/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:30.397 [94/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:30.397 [95/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:30.397 [96/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:30.397 [97/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:30.397 [98/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:30.397 [99/707] Linking static target lib/librte_metrics.a 00:02:30.397 [100/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:30.397 [101/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:30.397 [102/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:30.397 [103/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:30.397 [104/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:30.397 [105/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:30.397 [106/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:30.397 [107/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:30.397 [108/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:30.397 [109/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:30.397 [110/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:30.397 [111/707] Linking static target lib/librte_bitratestats.a 00:02:30.656 [112/707] Linking static target lib/librte_net.a 00:02:30.656 [113/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:30.656 
[114/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:30.656 [115/707] Linking static target lib/librte_cfgfile.a 00:02:30.656 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:30.656 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:30.656 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:30.656 [119/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:30.656 [120/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:30.656 [121/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:30.656 [122/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:30.656 [123/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:30.656 [124/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:30.656 [125/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:30.656 [126/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.656 [127/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.656 [128/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:30.656 [129/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:30.656 [130/707] Linking target lib/librte_log.so.24.0 00:02:30.656 [131/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:30.656 [132/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:30.656 [133/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:30.656 [134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:30.656 [135/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:30.656 [136/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:30.656 [137/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.916 [138/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:30.916 [139/707] Linking static target lib/librte_timer.a 00:02:30.916 [140/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:30.916 [141/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:30.916 [142/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:30.916 [143/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:30.916 [144/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:30.916 [145/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:30.916 [146/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:30.916 [147/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:30.916 [148/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.916 [149/707] Linking static target lib/librte_mempool.a 00:02:30.916 [150/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:30.916 [151/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:30.916 [152/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:30.916 [153/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:30.916 [154/707] 
Linking static target lib/librte_bbdev.a 00:02:30.916 [155/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.916 [156/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:30.916 [157/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:30.916 [158/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:30.916 [159/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:30.916 [160/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:30.916 [161/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:30.916 [162/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:30.916 [163/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:30.916 [164/707] Linking target lib/librte_kvargs.so.24.0 00:02:30.916 [165/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:30.916 [166/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:30.916 [167/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:30.916 [168/707] Linking static target lib/librte_jobstats.a 00:02:31.188 [169/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:31.188 [170/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:31.188 [171/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.188 [172/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:31.188 [173/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:31.188 [174/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:31.188 [175/707] Linking static target lib/librte_compressdev.a 00:02:31.188 [176/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:31.188 [177/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:31.188 [178/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:31.188 [179/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.188 [180/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:31.188 [181/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:31.188 [182/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:31.188 [183/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:31.188 [184/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:31.188 [185/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:31.188 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:31.188 [187/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:31.188 [188/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:31.188 [189/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:31.188 [190/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:31.188 [191/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:31.188 [192/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:31.188 [193/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:31.188 [194/707] Linking 
static target lib/librte_dispatcher.a 00:02:31.188 [195/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:31.188 [196/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:31.188 [197/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:31.188 [198/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:31.188 [199/707] Linking static target lib/librte_latencystats.a 00:02:31.188 [200/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:31.188 [201/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:31.188 [202/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:31.188 [203/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:31.188 [204/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:31.188 [205/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:31.188 [206/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:31.453 [207/707] Linking static target lib/librte_rcu.a 00:02:31.453 [208/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:31.453 [209/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:31.453 [210/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:31.453 [211/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:31.453 [212/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:31.453 [213/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:31.453 [214/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:31.453 [215/707] Linking static target lib/librte_eal.a 00:02:31.453 [216/707] Linking static target lib/librte_telemetry.a 00:02:31.453 [217/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:31.453 [218/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:31.453 [219/707] Linking static target lib/librte_gro.a 00:02:31.453 [220/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:31.453 [221/707] Linking static target lib/librte_stack.a 00:02:31.453 [222/707] Linking static target lib/librte_gpudev.a 00:02:31.453 [223/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:31.453 [224/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:31.453 [225/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.453 [226/707] Linking static target lib/librte_dmadev.a 00:02:31.453 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:31.453 [228/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:31.453 [229/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:31.453 [230/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:31.453 [231/707] Linking static target lib/librte_regexdev.a 00:02:31.453 [232/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:31.453 [233/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:31.453 [234/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:31.453 [235/707] Linking static target lib/librte_gso.a 00:02:31.453 [236/707] Linking static target lib/librte_distributor.a 00:02:31.453 [237/707] 
Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:31.453 [238/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:31.453 [239/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:31.453 [240/707] Linking static target lib/librte_rawdev.a 00:02:31.453 [241/707] Linking static target lib/librte_mldev.a 00:02:31.453 [242/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:31.453 [243/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:31.453 [244/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:31.453 [245/707] Linking static target lib/librte_mbuf.a 00:02:31.453 [246/707] Linking static target lib/librte_power.a 00:02:31.453 [247/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:31.454 [248/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:31.454 [249/707] Linking static target lib/librte_ip_frag.a 00:02:31.717 [250/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:31.717 [251/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [252/707] Linking static target lib/librte_pcapng.a 00:02:31.717 [253/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:31.717 [254/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:31.717 [255/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:31.717 [256/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:31.717 [257/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:31.717 [258/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [259/707] Linking static target lib/librte_reorder.a 00:02:31.717 [260/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:31.717 [261/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:31.717 [262/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:31.717 [263/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [264/707] Linking static target lib/librte_security.a 00:02:31.717 [265/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:31.717 [266/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:31.717 [267/707] Linking static target lib/librte_bpf.a 00:02:31.717 [268/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:31.717 [269/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:31.717 [270/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [271/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:31.717 [272/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [273/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [274/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.717 [275/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:31.981 [276/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:31.981 [277/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:31.981 [278/707] Compiling C object 
lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:31.981 [279/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:31.981 [280/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:31.981 [281/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:31.981 [282/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:31.981 [283/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [284/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:31.981 [285/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [286/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [287/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [288/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:31.981 [289/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:31.981 [290/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:31.981 [291/707] Linking static target lib/librte_rib.a 00:02:31.981 [292/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:31.981 [293/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [294/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [295/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:31.981 [296/707] Linking static target lib/librte_lpm.a 00:02:31.981 [297/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:31.981 [298/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:31.981 [299/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [300/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:31.981 [301/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:31.981 [302/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:31.981 [303/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.981 [304/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.241 [305/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:32.241 [306/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:32.241 [307/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:32.241 [308/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:32.241 [309/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:32.241 [310/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:32.241 [311/707] Linking target lib/librte_telemetry.so.24.0 00:02:32.241 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:32.241 [313/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:32.241 [314/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:32.241 [315/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.241 [316/707] Generating lib/bpf.sym_chk with a custom command (wrapped by 
meson to capture output) 00:02:32.241 [317/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:32.241 [318/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:32.241 [319/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:32.241 [320/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:32.241 [321/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:32.241 [322/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:32.241 [323/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:32.241 [324/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.241 [325/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:32.241 [326/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:32.241 [327/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:32.241 [328/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:02:32.241 [329/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:32.241 [330/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:32.241 [331/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:32.513 [332/707] Linking static target lib/librte_efd.a 00:02:32.513 [333/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:32.513 [334/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:32.513 [335/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:32.513 [336/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:32.513 [337/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.513 [338/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:32.513 [339/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.513 [340/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:02:32.513 [341/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:32.513 [342/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:32.513 [343/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:32.513 [344/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:32.513 [345/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:32.513 [346/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:32.513 [347/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:32.513 [348/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:32.513 [349/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:32.513 [350/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:32.513 [351/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:32.513 [352/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:32.779 [353/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:32.780 [354/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 [355/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 
[356/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:32.780 [357/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:32.780 [358/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:32.780 [359/707] Linking static target lib/librte_fib.a 00:02:32.780 [360/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:32.780 [361/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 [362/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:32.780 [363/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:02:32.780 [364/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:32.780 [365/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:32.780 [366/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 [367/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:32.780 [368/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 [369/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:32.780 [370/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:32.780 [371/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:32.780 [372/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:02:32.780 [373/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:02:32.780 [374/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:32.780 [375/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.780 [376/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:32.780 [377/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:32.780 [378/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:32.780 [379/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:33.043 [380/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:33.043 [381/707] Linking static target lib/librte_pdump.a 00:02:33.043 [382/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:33.043 [383/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:02:33.043 [384/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:02:33.043 [385/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:33.043 [386/707] Linking static target lib/librte_graph.a 00:02:33.043 [387/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:33.043 [388/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:33.043 [389/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:02:33.043 [390/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:02:33.043 [391/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:02:33.043 [392/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:02:33.043 [393/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:33.043 [394/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:02:33.043 [395/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:33.044 [396/707] Compiling C object 
app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:02:33.044 [397/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:02:33.044 [398/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:02:33.044 [399/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:33.044 [400/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:02:33.044 [401/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:33.044 [402/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:33.044 [403/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:02:33.309 [404/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:33.309 [405/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:33.309 [406/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:33.309 [407/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:33.309 [408/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:02:33.309 [409/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:33.309 [410/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:33.309 [411/707] Linking static target drivers/librte_bus_vdev.a 00:02:33.309 [412/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:02:33.309 [413/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:02:33.309 [414/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:33.309 [415/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:02:33.309 [416/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:33.309 [417/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:33.309 [418/707] Linking static target lib/librte_sched.a 00:02:33.309 [419/707] Linking static target lib/librte_table.a 00:02:33.309 [420/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:02:33.309 [421/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.309 [422/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:02:33.309 [423/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:33.309 [424/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:33.309 [425/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:33.309 [426/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:33.309 [427/707] Linking static target lib/librte_cryptodev.a 00:02:33.309 [428/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:33.309 [429/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:33.309 [430/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:33.309 [431/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:02:33.309 [432/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.573 [433/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:33.573 [434/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:33.573 [435/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:33.573 
[436/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:33.573 [437/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:33.573 [438/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:33.573 [439/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:33.573 [440/707] Linking static target drivers/librte_bus_pci.a 00:02:33.573 [441/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:33.573 [442/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:33.573 [443/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:33.573 [444/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:02:33.573 [445/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:02:33.573 [446/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:02:33.573 [447/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:33.573 [448/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:33.573 [449/707] Linking static target lib/librte_member.a 00:02:33.573 [450/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:33.573 [451/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:33.833 [452/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:33.833 [453/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:33.833 [454/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:33.833 [455/707] Linking static target lib/librte_ipsec.a 00:02:33.833 [456/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.833 [457/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:33.833 [458/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:33.833 [459/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:33.833 [460/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:33.833 [461/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:33.833 [462/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:33.833 [463/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:33.833 [464/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:33.833 [465/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:33.833 [466/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:33.833 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:33.833 [468/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.833 [469/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:33.833 [470/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:33.833 [471/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:33.833 [472/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:33.833 [473/707] Compiling C object 
app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:33.833 [474/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:33.833 [475/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:33.833 [476/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:33.833 [477/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:33.833 [478/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:33.833 [479/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:33.833 [480/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:33.833 [481/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:33.833 [482/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:33.833 [483/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:34.092 [484/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.092 [485/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:34.092 [486/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.092 [487/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:34.092 [488/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:34.092 [489/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:34.092 [490/707] Linking static target lib/librte_pdcp.a 00:02:34.092 [491/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:34.092 [492/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:34.092 [493/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:34.092 [494/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:34.092 [495/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:34.092 [496/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:34.092 [497/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:34.092 [498/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:34.092 [499/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:34.092 [500/707] Linking static target lib/librte_hash.a 00:02:34.092 [501/707] Linking static target lib/librte_port.a 00:02:34.092 [502/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:34.092 [503/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:34.092 [504/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.092 [505/707] Linking static target drivers/librte_mempool_ring.a 00:02:34.092 [506/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:34.092 [507/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:34.092 [508/707] Linking static target lib/librte_node.a 00:02:34.092 [509/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:34.092 [510/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to 
capture output) 00:02:34.092 [511/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:34.092 [512/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:34.092 [513/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:34.092 [514/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:34.092 [515/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:34.092 [516/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:34.092 [517/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:34.092 [518/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:34.092 [519/707] Linking static target lib/acl/libavx2_tmp.a 00:02:34.092 [520/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.092 [521/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:34.092 [522/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:34.352 [523/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:34.352 [524/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:34.352 [525/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:34.352 [526/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:34.352 [527/707] Linking static target lib/librte_eventdev.a 00:02:34.352 [528/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:34.352 [529/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:34.352 [530/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.352 [531/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:34.352 [532/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:34.352 [533/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:34.352 [534/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:34.352 [535/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:34.352 [536/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:34.352 [537/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.352 [538/707] Linking static target lib/librte_acl.a 00:02:34.352 [539/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:34.352 [540/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:34.352 [541/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:34.352 [542/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:34.352 [543/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:34.352 [544/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:34.352 [545/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:34.612 [546/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:34.612 [547/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.612 [548/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:34.612 [549/707] Compiling C object 
app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:34.612 [550/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:34.612 [551/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:34.612 [552/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:34.612 [553/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:34.612 [554/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:34.612 [555/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:34.612 [556/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:02:34.612 [557/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:34.612 [558/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:02:34.612 [559/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:34.872 [560/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:34.872 [561/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:34.872 [562/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.872 [563/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:34.872 [564/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:34.872 [565/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:34.872 [566/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.872 [567/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.872 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:34.872 [569/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:35.131 [570/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:35.131 [571/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:35.131 [572/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:35.131 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:35.390 [574/707] Linking static target lib/librte_ethdev.a 00:02:35.390 [575/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.649 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:35.649 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:35.907 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:35.907 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:35.907 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:36.843 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:36.843 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:36.843 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:36.843 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:36.843 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:37.103 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:37.103 
[587/707] Linking static target drivers/librte_net_i40e.a 00:02:37.362 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:37.930 [589/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.930 [590/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:38.189 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.756 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:44.026 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:44.026 [594/707] Linking target lib/librte_eal.so.24.0 00:02:44.026 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:44.026 [596/707] Linking target lib/librte_meter.so.24.0 00:02:44.026 [597/707] Linking target lib/librte_pci.so.24.0 00:02:44.026 [598/707] Linking target lib/librte_ring.so.24.0 00:02:44.026 [599/707] Linking target lib/librte_dmadev.so.24.0 00:02:44.026 [600/707] Linking target lib/librte_rawdev.so.24.0 00:02:44.026 [601/707] Linking target lib/librte_stack.so.24.0 00:02:44.026 [602/707] Linking target lib/librte_timer.so.24.0 00:02:44.026 [603/707] Linking target lib/librte_jobstats.so.24.0 00:02:44.026 [604/707] Linking target lib/librte_cfgfile.so.24.0 00:02:44.026 [605/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:44.026 [606/707] Linking target lib/librte_acl.so.24.0 00:02:44.026 [607/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:44.026 [608/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:44.026 [609/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:44.026 [610/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:44.026 [611/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:44.026 [612/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:44.026 [613/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:44.026 [614/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:44.026 [615/707] Linking target lib/librte_rcu.so.24.0 00:02:44.026 [616/707] Linking target lib/librte_mempool.so.24.0 00:02:44.026 [617/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:44.027 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:44.027 [619/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:44.027 [620/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:44.027 [621/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:44.027 [622/707] Linking target lib/librte_rib.so.24.0 00:02:44.027 [623/707] Linking target lib/librte_mbuf.so.24.0 00:02:44.027 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:44.027 [625/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:44.286 [626/707] Linking target lib/librte_compressdev.so.24.0 00:02:44.286 [627/707] Linking target lib/librte_cryptodev.so.24.0 00:02:44.286 [628/707] Linking target lib/librte_mldev.so.24.0 00:02:44.286 [629/707] Linking target lib/librte_gpudev.so.24.0 00:02:44.286 
[630/707] Linking target lib/librte_regexdev.so.24.0 00:02:44.286 [631/707] Linking target lib/librte_bbdev.so.24.0 00:02:44.286 [632/707] Linking target lib/librte_distributor.so.24.0 00:02:44.287 [633/707] Linking target lib/librte_net.so.24.0 00:02:44.287 [634/707] Linking target lib/librte_reorder.so.24.0 00:02:44.287 [635/707] Linking target lib/librte_sched.so.24.0 00:02:44.287 [636/707] Linking target lib/librte_fib.so.24.0 00:02:44.287 [637/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:44.287 [638/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:44.287 [639/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:44.287 [640/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:44.287 [641/707] Linking target lib/librte_hash.so.24.0 00:02:44.287 [642/707] Linking target lib/librte_security.so.24.0 00:02:44.287 [643/707] Linking target lib/librte_cmdline.so.24.0 00:02:44.287 [644/707] Linking target lib/librte_ethdev.so.24.0 00:02:44.546 [645/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:44.546 [646/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:44.546 [647/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:44.546 [648/707] Linking static target lib/librte_pipeline.a 00:02:44.546 [649/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:44.546 [650/707] Linking target lib/librte_efd.so.24.0 00:02:44.546 [651/707] Linking target lib/librte_lpm.so.24.0 00:02:44.546 [652/707] Linking target lib/librte_member.so.24.0 00:02:44.546 [653/707] Linking target lib/librte_pcapng.so.24.0 00:02:44.546 [654/707] Linking target lib/librte_ipsec.so.24.0 00:02:44.546 [655/707] Linking target lib/librte_metrics.so.24.0 00:02:44.546 [656/707] Linking target lib/librte_pdcp.so.24.0 00:02:44.546 [657/707] Linking target lib/librte_ip_frag.so.24.0 00:02:44.546 [658/707] Linking target lib/librte_gso.so.24.0 00:02:44.546 [659/707] Linking target lib/librte_gro.so.24.0 00:02:44.546 [660/707] Linking target lib/librte_power.so.24.0 00:02:44.546 [661/707] Linking target lib/librte_bpf.so.24.0 00:02:44.546 [662/707] Linking target lib/librte_eventdev.so.24.0 00:02:44.546 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:44.546 [664/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:44.546 [665/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:44.805 [666/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:44.805 [667/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:44.805 [668/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:44.805 [669/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:44.805 [670/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:44.805 [671/707] Linking target lib/librte_graph.so.24.0 00:02:44.805 [672/707] Linking target lib/librte_latencystats.so.24.0 00:02:44.805 [673/707] Linking target lib/librte_bitratestats.so.24.0 00:02:44.805 [674/707] Linking target lib/librte_dispatcher.so.24.0 00:02:44.805 [675/707] Linking target lib/librte_pdump.so.24.0 00:02:44.805 [676/707] 
Linking target lib/librte_port.so.24.0 00:02:44.805 [677/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:44.805 [678/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:44.805 [679/707] Linking static target lib/librte_vhost.a 00:02:44.805 [680/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:44.805 [681/707] Linking target lib/librte_node.so.24.0 00:02:45.064 [682/707] Linking target lib/librte_table.so.24.0 00:02:45.064 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:45.323 [684/707] Linking target app/dpdk-test-fib 00:02:45.323 [685/707] Linking target app/dpdk-test-cmdline 00:02:45.323 [686/707] Linking target app/dpdk-proc-info 00:02:45.323 [687/707] Linking target app/dpdk-pdump 00:02:45.323 [688/707] Linking target app/dpdk-test-acl 00:02:45.323 [689/707] Linking target app/dpdk-test-dma-perf 00:02:45.323 [690/707] Linking target app/dpdk-test-sad 00:02:45.323 [691/707] Linking target app/dpdk-test-mldev 00:02:45.323 [692/707] Linking target app/dpdk-test-regex 00:02:45.323 [693/707] Linking target app/dpdk-dumpcap 00:02:45.323 [694/707] Linking target app/dpdk-test-gpudev 00:02:45.323 [695/707] Linking target app/dpdk-graph 00:02:45.323 [696/707] Linking target app/dpdk-test-pipeline 00:02:45.323 [697/707] Linking target app/dpdk-test-security-perf 00:02:45.323 [698/707] Linking target app/dpdk-test-compress-perf 00:02:45.323 [699/707] Linking target app/dpdk-test-crypto-perf 00:02:45.323 [700/707] Linking target app/dpdk-test-bbdev 00:02:45.323 [701/707] Linking target app/dpdk-test-flow-perf 00:02:45.323 [702/707] Linking target app/dpdk-test-eventdev 00:02:45.323 [703/707] Linking target app/dpdk-testpmd 00:02:47.229 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:47.229 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:50.532 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:50.532 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:50.532 11:58:18 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s 00:02:50.532 11:58:18 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:50.532 11:58:18 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:50.533 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:50.533 [0/1] Installing files. 
00:02:50.533 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.533 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 
00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:50.534 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:50.534 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:50.535 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.535 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.536 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:50.539 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:50.539 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_mbuf.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 
Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.540 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_regexdev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.802 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_node.a 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.803 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.803 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.803 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:50.803 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.803 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:50.803 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.066 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.067 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.068 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:51.069 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:51.069 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:51.069 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:51.069 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:51.069 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:51.069 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:51.069 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:51.069 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:51.069 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:51.069 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:51.069 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:51.069 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:51.069 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:51.069 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:51.069 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:51.069 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:51.069 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:51.069 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:51.069 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:51.069 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:51.069 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:51.069 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:51.069 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:51.069 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:51.069 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:51.069 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:51.069 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:51.069 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:51.069 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:51.069 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:51.069 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:51.069 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:51.069 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:51.069 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:51.069 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:51.069 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:51.069 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:51.069 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:51.069 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:51.069 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:51.069 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:51.069 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:51.069 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:51.069 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:51.069 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:51.069 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:51.070 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:51.070 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:51.070 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:51.070 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:51.070 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:51.070 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:51.070 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:51.070 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:51.070 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:51.070 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:51.070 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:51.070 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:51.070 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:51.070 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:51.070 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:51.070 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:51.070 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:51.070 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:51.070 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:51.070 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:51.070 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:51.070 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:51.070 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:51.070 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:51.070 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:51.070 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:51.070 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:51.070 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:51.070 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:51.070 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:51.070 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:51.070 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:51.070 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:51.070 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:51.070 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:51.070 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:51.070 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:51.070 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:51.070 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:51.070 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:51.070 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:51.070 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:51.070 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:51.070 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:51.070 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:51.070 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:51.070 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:51.070 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:51.070 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:51.070 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:51.070 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:51.070 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:51.070 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:51.070 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:51.070 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:51.070 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:51.070 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:51.070 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:51.070 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:51.070 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:51.070 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:51.070 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:51.070 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:51.070 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:51.070 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:51.070 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:51.070 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:51.070 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:51.070 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:51.070 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:51.070 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:51.070 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:51.070 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:51.070 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:51.070 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:51.070 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:51.070 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:51.070 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:51.070 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:51.070 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:51.070 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:51.071 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:51.071 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:51.071 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:51.071 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:51.071 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:51.071 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:51.071 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:51.071 11:58:19 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:51.071 11:58:19 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:51.071 
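The install phase above stages DPDK into a local prefix: headers under dpdk/build/include, each library with a .so -> .so.24 -> .so.24.0 symlink chain under dpdk/build/lib, driver (PMD) libraries relocated into dpdk/build/lib/dpdk/pmds-24.0 by symlink-drivers-solibs.sh, and pkg-config metadata under dpdk/build/lib/pkgconfig. A minimal sketch for spot-checking that layout, assuming the paths shown in the log (the DPDK_PREFIX variable is illustrative, not taken from the autotest scripts):

    DPDK_PREFIX=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
    # headers staged by the 'Installing ... to .../build/include' steps
    ls "$DPDK_PREFIX"/include/rte_ring.h "$DPDK_PREFIX"/include/rte_ethdev.h
    # versioned symlink chain created per library, e.g. librte_eal.so -> .so.24 -> .so.24.0
    ls -l "$DPDK_PREFIX"/lib/librte_eal.so*
    # PMD plugin libraries moved under the pmds-24.0 directory
    ls "$DPDK_PREFIX"/lib/dpdk/pmds-24.0/
    # pkg-config metadata consumed later by the SPDK configure step
    PKG_CONFIG_PATH="$DPDK_PREFIX"/lib/pkgconfig pkg-config --modversion libdpdk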
00:02:51.071 real 0m28.107s 00:02:51.071 user 8m4.537s 00:02:51.071 sys 2m29.495s 00:02:51.071 11:58:19 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:51.071 11:58:19 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:51.071 ************************************ 00:02:51.071 END TEST build_native_dpdk 00:02:51.071 ************************************ 00:02:51.071 11:58:19 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:51.071 11:58:19 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:51.071 11:58:19 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:51.071 11:58:19 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:51.071 11:58:19 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:51.071 11:58:19 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:51.071 11:58:19 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:51.071 11:58:19 -- common/autotest_common.sh@10 -- $ set +x 00:02:51.071 ************************************ 00:02:51.071 START TEST autobuild_llvm_precompile 00:02:51.071 ************************************ 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autotest_common.sh@1125 -- $ _llvm_precompile 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:51.071 Target: x86_64-redhat-linux-gnu 00:02:51.071 Thread model: posix 00:02:51.071 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:51.071 11:58:19 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:51.331 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:51.590 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:51.590 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:51.590 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:52.158 Using 'verbs' RDMA provider 00:03:07.974 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:20.176 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:20.743 Creating mk/config.mk...done. 00:03:20.743 Creating mk/cc.flags.mk...done. 00:03:20.743 Type 'make' to build. 00:03:20.743 00:03:20.743 real 0m29.608s 00:03:20.743 user 0m12.969s 00:03:20.743 sys 0m16.052s 00:03:20.743 11:58:49 autobuild_llvm_precompile -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:20.743 11:58:49 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:20.743 ************************************ 00:03:20.743 END TEST autobuild_llvm_precompile 00:03:20.743 ************************************ 00:03:20.743 11:58:49 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:20.743 11:58:49 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:20.743 11:58:49 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:20.743 11:58:49 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:20.743 11:58:49 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:21.001 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:21.259 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:21.259 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:21.259 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:21.828 Using 'verbs' RDMA provider 00:03:34.968 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:44.949 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:45.778 Creating mk/config.mk...done. 00:03:45.778 Creating mk/cc.flags.mk...done. 00:03:45.778 Type 'make' to build. 00:03:45.778 11:59:14 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:45.778 11:59:14 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:03:45.778 11:59:14 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:03:45.778 11:59:14 -- common/autotest_common.sh@10 -- $ set +x 00:03:45.778 ************************************ 00:03:45.778 START TEST make 00:03:45.778 ************************************ 00:03:45.778 11:59:14 make -- common/autotest_common.sh@1125 -- $ make -j112 00:03:46.057 make[1]: Nothing to be done for 'all'. 
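The autobuild_llvm_precompile step above derives the clang major version from "clang --version" and uses it to pick the compiler names and the libFuzzer no-main archive that configure receives via --with-fuzzer. A condensed sketch of that detection, assuming bash and a Fedora-style clang install layout (variable names are illustrative, not the ones used in autobuild_common.sh):

    ver_line=$(clang --version | head -n1)
    if [[ $ver_line =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_major=${BASH_REMATCH[2]}   # e.g. 17 for "clang version 17.0.6"
    fi
    export CC=clang-$clang_major
    export CXX=clang++-$clang_major
    # locate the fuzzer runtime archive passed to configure as --with-fuzzer
    fuzzer_lib=$(ls /usr/lib*/clang/"$clang_major"*/lib/*linux*/libclang_rt.fuzzer_no_main*.a 2>/dev/null | head -n1)
    echo "CC=$CC CXX=$CXX fuzzer_lib=$fuzzer_lib"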
00:03:47.961 The Meson build system 00:03:47.961 Version: 1.5.0 00:03:47.961 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:47.961 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:47.961 Build type: native build 00:03:47.961 Project name: libvfio-user 00:03:47.961 Project version: 0.0.1 00:03:47.961 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:47.961 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:47.961 Host machine cpu family: x86_64 00:03:47.961 Host machine cpu: x86_64 00:03:47.961 Run-time dependency threads found: YES 00:03:47.961 Library dl found: YES 00:03:47.961 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:47.961 Run-time dependency json-c found: YES 0.17 00:03:47.961 Run-time dependency cmocka found: YES 1.1.7 00:03:47.961 Program pytest-3 found: NO 00:03:47.961 Program flake8 found: NO 00:03:47.961 Program misspell-fixer found: NO 00:03:47.961 Program restructuredtext-lint found: NO 00:03:47.961 Program valgrind found: YES (/usr/bin/valgrind) 00:03:47.961 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:47.961 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:47.961 Compiler for C supports arguments -Wwrite-strings: YES 00:03:47.961 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:47.961 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:47.961 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:47.961 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:47.961 Build targets in project: 8 00:03:47.961 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:47.961 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:47.961 00:03:47.961 libvfio-user 0.0.1 00:03:47.961 00:03:47.961 User defined options 00:03:47.961 buildtype : debug 00:03:47.961 default_library: static 00:03:47.961 libdir : /usr/local/lib 00:03:47.961 00:03:47.961 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:47.961 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:48.220 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:48.220 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:48.220 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:48.220 [4/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:48.220 [5/36] Compiling C object samples/null.p/null.c.o 00:03:48.220 [6/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:48.220 [7/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:48.220 [8/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:48.220 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:48.220 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:48.220 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:48.220 [12/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:48.220 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:48.220 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:48.220 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:48.220 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:48.220 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:48.220 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:48.220 [19/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:48.220 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:48.220 [21/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:48.220 [22/36] Compiling C object samples/server.p/server.c.o 00:03:48.220 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:48.220 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:48.220 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:48.220 [26/36] Compiling C object samples/client.p/client.c.o 00:03:48.220 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:48.220 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:48.220 [29/36] Linking static target lib/libvfio-user.a 00:03:48.220 [30/36] Linking target samples/client 00:03:48.220 [31/36] Linking target samples/gpio-pci-idio-16 00:03:48.220 [32/36] Linking target samples/shadow_ioeventfd_server 00:03:48.220 [33/36] Linking target test/unit_tests 00:03:48.220 [34/36] Linking target samples/lspci 00:03:48.220 [35/36] Linking target samples/null 00:03:48.220 [36/36] Linking target samples/server 00:03:48.220 INFO: autodetecting backend as ninja 00:03:48.220 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:48.479 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:48.738 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:48.738 ninja: no work to do. 00:04:00.947 CC lib/log/log.o 00:04:00.947 CC lib/log/log_flags.o 00:04:00.947 CC lib/log/log_deprecated.o 00:04:00.947 CC lib/ut/ut.o 00:04:00.947 CC lib/ut_mock/mock.o 00:04:00.947 LIB libspdk_log.a 00:04:00.947 LIB libspdk_ut.a 00:04:01.205 LIB libspdk_ut_mock.a 00:04:01.515 CC lib/ioat/ioat.o 00:04:01.515 CC lib/dma/dma.o 00:04:01.515 CC lib/util/base64.o 00:04:01.515 CC lib/util/bit_array.o 00:04:01.515 CXX lib/trace_parser/trace.o 00:04:01.515 CC lib/util/cpuset.o 00:04:01.515 CC lib/util/crc16.o 00:04:01.515 CC lib/util/crc32.o 00:04:01.515 CC lib/util/crc64.o 00:04:01.515 CC lib/util/crc32c.o 00:04:01.515 CC lib/util/crc32_ieee.o 00:04:01.515 CC lib/util/dif.o 00:04:01.515 CC lib/util/fd.o 00:04:01.515 CC lib/util/fd_group.o 00:04:01.515 CC lib/util/file.o 00:04:01.515 CC lib/util/hexlify.o 00:04:01.515 CC lib/util/iov.o 00:04:01.515 CC lib/util/math.o 00:04:01.515 CC lib/util/net.o 00:04:01.515 CC lib/util/pipe.o 00:04:01.515 CC lib/util/strerror_tls.o 00:04:01.515 CC lib/util/xor.o 00:04:01.515 CC lib/util/string.o 00:04:01.515 CC lib/util/uuid.o 00:04:01.515 CC lib/util/zipf.o 00:04:01.515 CC lib/util/md5.o 00:04:01.515 CC lib/vfio_user/host/vfio_user_pci.o 00:04:01.515 CC lib/vfio_user/host/vfio_user.o 00:04:01.515 LIB libspdk_dma.a 00:04:01.515 LIB libspdk_ioat.a 00:04:01.828 LIB libspdk_vfio_user.a 00:04:01.829 LIB libspdk_util.a 00:04:01.829 LIB libspdk_trace_parser.a 00:04:02.158 CC lib/json/json_parse.o 00:04:02.158 CC lib/conf/conf.o 00:04:02.158 CC lib/json/json_util.o 00:04:02.158 CC lib/json/json_write.o 00:04:02.158 CC lib/vmd/vmd.o 00:04:02.158 CC lib/vmd/led.o 00:04:02.158 CC lib/idxd/idxd_user.o 00:04:02.158 CC lib/env_dpdk/env.o 00:04:02.158 CC lib/rdma_utils/rdma_utils.o 00:04:02.158 CC lib/rdma_provider/common.o 00:04:02.158 CC lib/idxd/idxd.o 00:04:02.158 CC lib/env_dpdk/memory.o 00:04:02.158 CC lib/rdma_provider/rdma_provider_verbs.o 00:04:02.158 CC lib/env_dpdk/pci.o 00:04:02.158 CC lib/env_dpdk/init.o 00:04:02.158 CC lib/idxd/idxd_kernel.o 00:04:02.158 CC lib/env_dpdk/threads.o 00:04:02.158 CC lib/env_dpdk/pci_ioat.o 00:04:02.158 CC lib/env_dpdk/pci_vmd.o 00:04:02.158 CC lib/env_dpdk/pci_virtio.o 00:04:02.158 CC lib/env_dpdk/pci_idxd.o 00:04:02.158 CC lib/env_dpdk/pci_event.o 00:04:02.158 CC lib/env_dpdk/pci_dpdk_2207.o 00:04:02.158 CC lib/env_dpdk/sigbus_handler.o 00:04:02.158 CC lib/env_dpdk/pci_dpdk.o 00:04:02.158 CC lib/env_dpdk/pci_dpdk_2211.o 00:04:02.158 LIB libspdk_conf.a 00:04:02.158 LIB libspdk_rdma_provider.a 00:04:02.158 LIB libspdk_json.a 00:04:02.158 LIB libspdk_rdma_utils.a 00:04:02.417 LIB libspdk_idxd.a 00:04:02.417 LIB libspdk_vmd.a 00:04:02.417 CC lib/jsonrpc/jsonrpc_server.o 00:04:02.417 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:04:02.417 CC lib/jsonrpc/jsonrpc_client.o 00:04:02.417 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:04:02.676 LIB libspdk_jsonrpc.a 00:04:02.936 LIB libspdk_env_dpdk.a 00:04:02.936 CC lib/rpc/rpc.o 00:04:03.195 LIB libspdk_rpc.a 00:04:03.454 CC lib/keyring/keyring.o 00:04:03.454 CC lib/keyring/keyring_rpc.o 00:04:03.454 CC lib/notify/notify.o 00:04:03.454 CC lib/notify/notify_rpc.o 00:04:03.454 CC lib/trace/trace.o 00:04:03.454 CC lib/trace/trace_rpc.o 00:04:03.454 CC lib/trace/trace_flags.o 00:04:03.454 LIB libspdk_notify.a 00:04:03.713 LIB libspdk_keyring.a 00:04:03.713 LIB 
libspdk_trace.a 00:04:03.972 CC lib/thread/thread.o 00:04:03.972 CC lib/thread/iobuf.o 00:04:03.972 CC lib/sock/sock.o 00:04:03.972 CC lib/sock/sock_rpc.o 00:04:04.232 LIB libspdk_sock.a 00:04:04.490 CC lib/nvme/nvme_ctrlr_cmd.o 00:04:04.490 CC lib/nvme/nvme_ns_cmd.o 00:04:04.490 CC lib/nvme/nvme_ctrlr.o 00:04:04.490 CC lib/nvme/nvme_fabric.o 00:04:04.490 CC lib/nvme/nvme_ns.o 00:04:04.490 CC lib/nvme/nvme_pcie_common.o 00:04:04.490 CC lib/nvme/nvme.o 00:04:04.490 CC lib/nvme/nvme_pcie.o 00:04:04.490 CC lib/nvme/nvme_transport.o 00:04:04.490 CC lib/nvme/nvme_qpair.o 00:04:04.490 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:04:04.490 CC lib/nvme/nvme_quirks.o 00:04:04.490 CC lib/nvme/nvme_discovery.o 00:04:04.490 CC lib/nvme/nvme_tcp.o 00:04:04.490 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:04:04.490 CC lib/nvme/nvme_opal.o 00:04:04.490 CC lib/nvme/nvme_io_msg.o 00:04:04.490 CC lib/nvme/nvme_poll_group.o 00:04:04.490 CC lib/nvme/nvme_zns.o 00:04:04.490 CC lib/nvme/nvme_stubs.o 00:04:04.490 CC lib/nvme/nvme_auth.o 00:04:04.490 CC lib/nvme/nvme_cuse.o 00:04:04.490 CC lib/nvme/nvme_vfio_user.o 00:04:04.490 CC lib/nvme/nvme_rdma.o 00:04:04.748 LIB libspdk_thread.a 00:04:05.007 CC lib/vfu_tgt/tgt_endpoint.o 00:04:05.007 CC lib/vfu_tgt/tgt_rpc.o 00:04:05.007 CC lib/init/json_config.o 00:04:05.007 CC lib/init/subsystem.o 00:04:05.007 CC lib/init/rpc.o 00:04:05.007 CC lib/init/subsystem_rpc.o 00:04:05.007 CC lib/accel/accel_sw.o 00:04:05.007 CC lib/accel/accel.o 00:04:05.007 CC lib/accel/accel_rpc.o 00:04:05.007 CC lib/virtio/virtio_vfio_user.o 00:04:05.007 CC lib/virtio/virtio.o 00:04:05.007 CC lib/blob/blobstore.o 00:04:05.007 CC lib/blob/request.o 00:04:05.007 CC lib/virtio/virtio_vhost_user.o 00:04:05.007 CC lib/virtio/virtio_pci.o 00:04:05.007 CC lib/blob/zeroes.o 00:04:05.007 CC lib/blob/blob_bs_dev.o 00:04:05.007 CC lib/fsdev/fsdev.o 00:04:05.007 CC lib/fsdev/fsdev_io.o 00:04:05.007 CC lib/fsdev/fsdev_rpc.o 00:04:05.007 LIB libspdk_init.a 00:04:05.266 LIB libspdk_vfu_tgt.a 00:04:05.266 LIB libspdk_virtio.a 00:04:05.266 LIB libspdk_fsdev.a 00:04:05.266 CC lib/event/app.o 00:04:05.266 CC lib/event/reactor.o 00:04:05.267 CC lib/event/log_rpc.o 00:04:05.267 CC lib/event/app_rpc.o 00:04:05.267 CC lib/event/scheduler_static.o 00:04:05.526 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:04:05.526 LIB libspdk_event.a 00:04:05.785 LIB libspdk_accel.a 00:04:05.785 LIB libspdk_nvme.a 00:04:06.044 CC lib/bdev/bdev.o 00:04:06.044 CC lib/bdev/part.o 00:04:06.044 CC lib/bdev/bdev_rpc.o 00:04:06.044 CC lib/bdev/bdev_zone.o 00:04:06.044 CC lib/bdev/scsi_nvme.o 00:04:06.044 LIB libspdk_fuse_dispatcher.a 00:04:06.611 LIB libspdk_blob.a 00:04:07.178 CC lib/blobfs/blobfs.o 00:04:07.178 CC lib/blobfs/tree.o 00:04:07.178 CC lib/lvol/lvol.o 00:04:07.437 LIB libspdk_lvol.a 00:04:07.697 LIB libspdk_blobfs.a 00:04:07.697 LIB libspdk_bdev.a 00:04:07.954 CC lib/nbd/nbd.o 00:04:07.954 CC lib/nbd/nbd_rpc.o 00:04:07.954 CC lib/ublk/ublk.o 00:04:07.954 CC lib/ublk/ublk_rpc.o 00:04:07.954 CC lib/scsi/dev.o 00:04:07.954 CC lib/scsi/lun.o 00:04:07.954 CC lib/nvmf/ctrlr.o 00:04:07.954 CC lib/scsi/port.o 00:04:07.954 CC lib/nvmf/ctrlr_discovery.o 00:04:07.954 CC lib/scsi/scsi.o 00:04:07.954 CC lib/nvmf/ctrlr_bdev.o 00:04:07.954 CC lib/scsi/task.o 00:04:07.954 CC lib/scsi/scsi_bdev.o 00:04:07.954 CC lib/scsi/scsi_pr.o 00:04:07.954 CC lib/nvmf/subsystem.o 00:04:07.954 CC lib/scsi/scsi_rpc.o 00:04:07.954 CC lib/nvmf/nvmf.o 00:04:07.954 CC lib/nvmf/nvmf_rpc.o 00:04:07.954 CC lib/nvmf/transport.o 00:04:07.954 CC lib/nvmf/tcp.o 00:04:07.954 CC 
lib/ftl/ftl_core.o 00:04:07.954 CC lib/nvmf/stubs.o 00:04:07.954 CC lib/ftl/ftl_debug.o 00:04:07.954 CC lib/nvmf/mdns_server.o 00:04:07.954 CC lib/ftl/ftl_init.o 00:04:07.954 CC lib/ftl/ftl_layout.o 00:04:07.954 CC lib/nvmf/vfio_user.o 00:04:07.954 CC lib/nvmf/rdma.o 00:04:07.954 CC lib/nvmf/auth.o 00:04:07.954 CC lib/ftl/ftl_io.o 00:04:07.955 CC lib/ftl/ftl_sb.o 00:04:07.955 CC lib/ftl/ftl_l2p.o 00:04:07.955 CC lib/ftl/ftl_l2p_flat.o 00:04:07.955 CC lib/ftl/ftl_nv_cache.o 00:04:07.955 CC lib/ftl/ftl_band.o 00:04:07.955 CC lib/ftl/ftl_band_ops.o 00:04:07.955 CC lib/ftl/ftl_writer.o 00:04:07.955 CC lib/ftl/ftl_rq.o 00:04:07.955 CC lib/ftl/ftl_reloc.o 00:04:07.955 CC lib/ftl/ftl_l2p_cache.o 00:04:07.955 CC lib/ftl/ftl_p2l.o 00:04:07.955 CC lib/ftl/ftl_p2l_log.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:07.955 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:07.955 CC lib/ftl/utils/ftl_md.o 00:04:07.955 CC lib/ftl/utils/ftl_mempool.o 00:04:07.955 CC lib/ftl/utils/ftl_conf.o 00:04:07.955 CC lib/ftl/utils/ftl_bitmap.o 00:04:07.955 CC lib/ftl/utils/ftl_property.o 00:04:07.955 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:07.955 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:07.955 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:07.955 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:07.955 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:07.955 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:07.955 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:04:07.955 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:04:07.955 CC lib/ftl/base/ftl_base_dev.o 00:04:07.955 CC lib/ftl/base/ftl_base_bdev.o 00:04:07.955 CC lib/ftl/ftl_trace.o 00:04:08.520 LIB libspdk_nbd.a 00:04:08.520 LIB libspdk_ublk.a 00:04:08.520 LIB libspdk_scsi.a 00:04:08.520 LIB libspdk_ftl.a 00:04:08.778 CC lib/vhost/vhost_rpc.o 00:04:08.778 CC lib/vhost/vhost_scsi.o 00:04:08.778 CC lib/vhost/vhost.o 00:04:08.778 CC lib/vhost/vhost_blk.o 00:04:08.778 CC lib/vhost/rte_vhost_user.o 00:04:08.778 CC lib/iscsi/iscsi.o 00:04:08.778 CC lib/iscsi/conn.o 00:04:08.778 CC lib/iscsi/init_grp.o 00:04:08.778 CC lib/iscsi/portal_grp.o 00:04:08.778 CC lib/iscsi/param.o 00:04:08.778 CC lib/iscsi/tgt_node.o 00:04:08.778 CC lib/iscsi/task.o 00:04:08.778 CC lib/iscsi/iscsi_subsystem.o 00:04:08.778 CC lib/iscsi/iscsi_rpc.o 00:04:09.037 LIB libspdk_nvmf.a 00:04:09.296 LIB libspdk_vhost.a 00:04:09.555 LIB libspdk_iscsi.a 00:04:10.121 CC module/env_dpdk/env_dpdk_rpc.o 00:04:10.121 CC module/vfu_device/vfu_virtio.o 00:04:10.121 CC module/vfu_device/vfu_virtio_blk.o 00:04:10.121 CC module/vfu_device/vfu_virtio_scsi.o 00:04:10.121 CC module/vfu_device/vfu_virtio_fs.o 00:04:10.121 CC module/vfu_device/vfu_virtio_rpc.o 00:04:10.121 CC module/accel/dsa/accel_dsa.o 00:04:10.121 CC module/accel/dsa/accel_dsa_rpc.o 00:04:10.121 LIB libspdk_env_dpdk_rpc.a 00:04:10.121 CC 
module/accel/error/accel_error.o 00:04:10.121 CC module/accel/error/accel_error_rpc.o 00:04:10.121 CC module/keyring/linux/keyring.o 00:04:10.121 CC module/keyring/linux/keyring_rpc.o 00:04:10.121 CC module/blob/bdev/blob_bdev.o 00:04:10.121 CC module/keyring/file/keyring.o 00:04:10.121 CC module/keyring/file/keyring_rpc.o 00:04:10.121 CC module/fsdev/aio/fsdev_aio.o 00:04:10.121 CC module/fsdev/aio/linux_aio_mgr.o 00:04:10.121 CC module/fsdev/aio/fsdev_aio_rpc.o 00:04:10.121 CC module/scheduler/gscheduler/gscheduler.o 00:04:10.121 CC module/sock/posix/posix.o 00:04:10.121 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:10.121 CC module/accel/ioat/accel_ioat_rpc.o 00:04:10.121 CC module/accel/ioat/accel_ioat.o 00:04:10.121 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:10.122 CC module/accel/iaa/accel_iaa.o 00:04:10.122 CC module/accel/iaa/accel_iaa_rpc.o 00:04:10.122 LIB libspdk_keyring_linux.a 00:04:10.122 LIB libspdk_scheduler_gscheduler.a 00:04:10.122 LIB libspdk_accel_error.a 00:04:10.122 LIB libspdk_keyring_file.a 00:04:10.381 LIB libspdk_scheduler_dpdk_governor.a 00:04:10.381 LIB libspdk_scheduler_dynamic.a 00:04:10.381 LIB libspdk_accel_ioat.a 00:04:10.381 LIB libspdk_accel_iaa.a 00:04:10.381 LIB libspdk_blob_bdev.a 00:04:10.381 LIB libspdk_accel_dsa.a 00:04:10.381 LIB libspdk_vfu_device.a 00:04:10.640 LIB libspdk_fsdev_aio.a 00:04:10.640 LIB libspdk_sock_posix.a 00:04:10.640 CC module/bdev/null/bdev_null.o 00:04:10.640 CC module/bdev/null/bdev_null_rpc.o 00:04:10.640 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:10.640 CC module/bdev/delay/vbdev_delay.o 00:04:10.640 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:10.640 CC module/bdev/nvme/nvme_rpc.o 00:04:10.640 CC module/bdev/nvme/bdev_nvme.o 00:04:10.640 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:10.640 CC module/bdev/nvme/vbdev_opal.o 00:04:10.640 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:10.640 CC module/bdev/nvme/bdev_mdns_client.o 00:04:10.640 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:10.640 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:10.640 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:10.640 CC module/bdev/malloc/bdev_malloc.o 00:04:10.640 CC module/bdev/gpt/gpt.o 00:04:10.640 CC module/bdev/gpt/vbdev_gpt.o 00:04:10.640 CC module/blobfs/bdev/blobfs_bdev.o 00:04:10.640 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:10.640 CC module/bdev/raid/bdev_raid_rpc.o 00:04:10.640 CC module/bdev/raid/bdev_raid_sb.o 00:04:10.640 CC module/bdev/raid/bdev_raid.o 00:04:10.640 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:10.640 CC module/bdev/raid/raid1.o 00:04:10.640 CC module/bdev/error/vbdev_error.o 00:04:10.640 CC module/bdev/raid/raid0.o 00:04:10.640 CC module/bdev/passthru/vbdev_passthru.o 00:04:10.640 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:10.640 CC module/bdev/raid/concat.o 00:04:10.640 CC module/bdev/error/vbdev_error_rpc.o 00:04:10.640 CC module/bdev/split/vbdev_split_rpc.o 00:04:10.640 CC module/bdev/split/vbdev_split.o 00:04:10.640 CC module/bdev/aio/bdev_aio_rpc.o 00:04:10.640 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:10.640 CC module/bdev/aio/bdev_aio.o 00:04:10.640 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:10.640 CC module/bdev/lvol/vbdev_lvol.o 00:04:10.640 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:10.640 CC module/bdev/ftl/bdev_ftl.o 00:04:10.640 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:10.640 CC module/bdev/iscsi/bdev_iscsi.o 00:04:10.640 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:10.899 LIB libspdk_blobfs_bdev.a 00:04:10.899 LIB libspdk_bdev_null.a 00:04:10.899 LIB 
libspdk_bdev_split.a 00:04:10.899 LIB libspdk_bdev_gpt.a 00:04:10.899 LIB libspdk_bdev_error.a 00:04:10.899 LIB libspdk_bdev_passthru.a 00:04:10.899 LIB libspdk_bdev_ftl.a 00:04:10.899 LIB libspdk_bdev_delay.a 00:04:10.899 LIB libspdk_bdev_zone_block.a 00:04:10.899 LIB libspdk_bdev_aio.a 00:04:10.899 LIB libspdk_bdev_iscsi.a 00:04:10.899 LIB libspdk_bdev_malloc.a 00:04:11.158 LIB libspdk_bdev_virtio.a 00:04:11.158 LIB libspdk_bdev_lvol.a 00:04:11.416 LIB libspdk_bdev_raid.a 00:04:11.985 LIB libspdk_bdev_nvme.a 00:04:12.553 CC module/event/subsystems/iobuf/iobuf.o 00:04:12.553 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:12.553 CC module/event/subsystems/scheduler/scheduler.o 00:04:12.553 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:12.553 CC module/event/subsystems/sock/sock.o 00:04:12.553 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:12.553 CC module/event/subsystems/vmd/vmd.o 00:04:12.553 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:12.553 CC module/event/subsystems/keyring/keyring.o 00:04:12.553 CC module/event/subsystems/fsdev/fsdev.o 00:04:12.553 LIB libspdk_event_scheduler.a 00:04:12.553 LIB libspdk_event_iobuf.a 00:04:12.811 LIB libspdk_event_vfu_tgt.a 00:04:12.811 LIB libspdk_event_vhost_blk.a 00:04:12.811 LIB libspdk_event_keyring.a 00:04:12.811 LIB libspdk_event_vmd.a 00:04:12.811 LIB libspdk_event_sock.a 00:04:12.811 LIB libspdk_event_fsdev.a 00:04:12.811 CC module/event/subsystems/accel/accel.o 00:04:13.070 LIB libspdk_event_accel.a 00:04:13.329 CC module/event/subsystems/bdev/bdev.o 00:04:13.587 LIB libspdk_event_bdev.a 00:04:13.846 CC module/event/subsystems/nbd/nbd.o 00:04:13.846 CC module/event/subsystems/ublk/ublk.o 00:04:13.846 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:13.846 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:13.846 CC module/event/subsystems/scsi/scsi.o 00:04:13.846 LIB libspdk_event_nbd.a 00:04:13.846 LIB libspdk_event_ublk.a 00:04:13.846 LIB libspdk_event_scsi.a 00:04:13.846 LIB libspdk_event_nvmf.a 00:04:14.105 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:14.363 CC module/event/subsystems/iscsi/iscsi.o 00:04:14.363 LIB libspdk_event_vhost_scsi.a 00:04:14.363 LIB libspdk_event_iscsi.a 00:04:14.620 CC app/trace_record/trace_record.o 00:04:14.620 CXX app/trace/trace.o 00:04:14.620 CC app/spdk_nvme_discover/discovery_aer.o 00:04:14.620 CC app/spdk_nvme_perf/perf.o 00:04:14.620 CC app/spdk_top/spdk_top.o 00:04:14.620 CC app/spdk_lspci/spdk_lspci.o 00:04:14.620 CC app/spdk_nvme_identify/identify.o 00:04:14.620 CC test/rpc_client/rpc_client_test.o 00:04:14.620 TEST_HEADER include/spdk/accel_module.h 00:04:14.620 TEST_HEADER include/spdk/accel.h 00:04:14.620 TEST_HEADER include/spdk/assert.h 00:04:14.620 TEST_HEADER include/spdk/barrier.h 00:04:14.620 TEST_HEADER include/spdk/bdev_module.h 00:04:14.620 TEST_HEADER include/spdk/base64.h 00:04:14.620 TEST_HEADER include/spdk/bdev_zone.h 00:04:14.620 TEST_HEADER include/spdk/bdev.h 00:04:14.620 TEST_HEADER include/spdk/bit_array.h 00:04:14.620 TEST_HEADER include/spdk/bit_pool.h 00:04:14.620 TEST_HEADER include/spdk/blob_bdev.h 00:04:14.620 TEST_HEADER include/spdk/blobfs.h 00:04:14.620 TEST_HEADER include/spdk/blob.h 00:04:14.620 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:14.620 TEST_HEADER include/spdk/conf.h 00:04:14.620 TEST_HEADER include/spdk/cpuset.h 00:04:14.620 TEST_HEADER include/spdk/config.h 00:04:14.620 TEST_HEADER include/spdk/crc16.h 00:04:14.620 TEST_HEADER include/spdk/dif.h 00:04:14.620 TEST_HEADER include/spdk/crc64.h 00:04:14.620 TEST_HEADER 
include/spdk/crc32.h 00:04:14.620 TEST_HEADER include/spdk/dma.h 00:04:14.620 CC app/spdk_dd/spdk_dd.o 00:04:14.620 TEST_HEADER include/spdk/endian.h 00:04:14.620 TEST_HEADER include/spdk/env_dpdk.h 00:04:14.620 TEST_HEADER include/spdk/env.h 00:04:14.620 TEST_HEADER include/spdk/event.h 00:04:14.620 TEST_HEADER include/spdk/fd_group.h 00:04:14.620 CC app/nvmf_tgt/nvmf_main.o 00:04:14.620 TEST_HEADER include/spdk/ftl.h 00:04:14.620 TEST_HEADER include/spdk/fd.h 00:04:14.620 TEST_HEADER include/spdk/file.h 00:04:14.620 TEST_HEADER include/spdk/fsdev.h 00:04:14.620 TEST_HEADER include/spdk/fuse_dispatcher.h 00:04:14.620 TEST_HEADER include/spdk/gpt_spec.h 00:04:14.620 TEST_HEADER include/spdk/fsdev_module.h 00:04:14.620 TEST_HEADER include/spdk/hexlify.h 00:04:14.620 TEST_HEADER include/spdk/histogram_data.h 00:04:14.620 TEST_HEADER include/spdk/idxd_spec.h 00:04:14.620 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:14.620 TEST_HEADER include/spdk/ioat.h 00:04:14.620 TEST_HEADER include/spdk/idxd.h 00:04:14.620 CC app/iscsi_tgt/iscsi_tgt.o 00:04:14.620 TEST_HEADER include/spdk/init.h 00:04:14.620 TEST_HEADER include/spdk/iscsi_spec.h 00:04:14.620 TEST_HEADER include/spdk/jsonrpc.h 00:04:14.620 TEST_HEADER include/spdk/ioat_spec.h 00:04:14.620 TEST_HEADER include/spdk/keyring_module.h 00:04:14.620 TEST_HEADER include/spdk/json.h 00:04:14.620 TEST_HEADER include/spdk/likely.h 00:04:14.620 TEST_HEADER include/spdk/keyring.h 00:04:14.620 TEST_HEADER include/spdk/lvol.h 00:04:14.620 TEST_HEADER include/spdk/log.h 00:04:14.620 TEST_HEADER include/spdk/memory.h 00:04:14.620 TEST_HEADER include/spdk/md5.h 00:04:14.620 TEST_HEADER include/spdk/mmio.h 00:04:14.620 TEST_HEADER include/spdk/net.h 00:04:14.620 TEST_HEADER include/spdk/nbd.h 00:04:14.620 TEST_HEADER include/spdk/notify.h 00:04:14.620 TEST_HEADER include/spdk/nvme_intel.h 00:04:14.620 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:14.620 TEST_HEADER include/spdk/nvme.h 00:04:14.620 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:14.620 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:14.620 TEST_HEADER include/spdk/nvme_spec.h 00:04:14.620 TEST_HEADER include/spdk/nvme_zns.h 00:04:14.620 TEST_HEADER include/spdk/nvmf.h 00:04:14.620 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:14.620 TEST_HEADER include/spdk/nvmf_transport.h 00:04:14.882 TEST_HEADER include/spdk/nvmf_spec.h 00:04:14.882 TEST_HEADER include/spdk/opal.h 00:04:14.882 TEST_HEADER include/spdk/opal_spec.h 00:04:14.882 CC app/spdk_tgt/spdk_tgt.o 00:04:14.882 TEST_HEADER include/spdk/queue.h 00:04:14.882 TEST_HEADER include/spdk/pci_ids.h 00:04:14.882 TEST_HEADER include/spdk/pipe.h 00:04:14.882 TEST_HEADER include/spdk/reduce.h 00:04:14.882 TEST_HEADER include/spdk/rpc.h 00:04:14.882 TEST_HEADER include/spdk/scsi_spec.h 00:04:14.882 TEST_HEADER include/spdk/scheduler.h 00:04:14.882 TEST_HEADER include/spdk/sock.h 00:04:14.882 TEST_HEADER include/spdk/scsi.h 00:04:14.882 TEST_HEADER include/spdk/stdinc.h 00:04:14.882 TEST_HEADER include/spdk/string.h 00:04:14.882 TEST_HEADER include/spdk/trace.h 00:04:14.882 TEST_HEADER include/spdk/thread.h 00:04:14.882 TEST_HEADER include/spdk/ublk.h 00:04:14.882 TEST_HEADER include/spdk/trace_parser.h 00:04:14.882 TEST_HEADER include/spdk/uuid.h 00:04:14.882 TEST_HEADER include/spdk/tree.h 00:04:14.882 TEST_HEADER include/spdk/util.h 00:04:14.882 TEST_HEADER include/spdk/version.h 00:04:14.882 TEST_HEADER include/spdk/vhost.h 00:04:14.882 TEST_HEADER include/spdk/vmd.h 00:04:14.882 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:14.882 TEST_HEADER 
include/spdk/vfio_user_spec.h 00:04:14.882 TEST_HEADER include/spdk/zipf.h 00:04:14.882 CXX test/cpp_headers/assert.o 00:04:14.882 TEST_HEADER include/spdk/xor.h 00:04:14.882 CXX test/cpp_headers/accel.o 00:04:14.882 CXX test/cpp_headers/accel_module.o 00:04:14.882 CXX test/cpp_headers/barrier.o 00:04:14.882 CXX test/cpp_headers/base64.o 00:04:14.882 CXX test/cpp_headers/bdev.o 00:04:14.882 CXX test/cpp_headers/bdev_zone.o 00:04:14.882 CXX test/cpp_headers/bdev_module.o 00:04:14.882 CXX test/cpp_headers/bit_array.o 00:04:14.882 CXX test/cpp_headers/blob_bdev.o 00:04:14.882 CXX test/cpp_headers/bit_pool.o 00:04:14.882 CXX test/cpp_headers/blobfs_bdev.o 00:04:14.882 CXX test/cpp_headers/blob.o 00:04:14.882 CXX test/cpp_headers/conf.o 00:04:14.882 CXX test/cpp_headers/blobfs.o 00:04:14.882 CXX test/cpp_headers/cpuset.o 00:04:14.882 CC test/env/vtophys/vtophys.o 00:04:14.882 CXX test/cpp_headers/crc16.o 00:04:14.882 CXX test/cpp_headers/config.o 00:04:14.882 CXX test/cpp_headers/crc32.o 00:04:14.882 CXX test/cpp_headers/dif.o 00:04:14.882 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:14.882 CC examples/ioat/perf/perf.o 00:04:14.882 CXX test/cpp_headers/crc64.o 00:04:14.882 CXX test/cpp_headers/dma.o 00:04:14.882 CC examples/ioat/verify/verify.o 00:04:14.882 CXX test/cpp_headers/env_dpdk.o 00:04:14.882 CXX test/cpp_headers/endian.o 00:04:14.882 CC test/env/memory/memory_ut.o 00:04:14.882 CXX test/cpp_headers/env.o 00:04:14.882 CC test/env/pci/pci_ut.o 00:04:14.882 CXX test/cpp_headers/fd_group.o 00:04:14.882 CXX test/cpp_headers/event.o 00:04:14.882 CC app/fio/nvme/fio_plugin.o 00:04:14.882 CXX test/cpp_headers/fd.o 00:04:14.882 LINK spdk_lspci 00:04:14.882 CXX test/cpp_headers/file.o 00:04:14.882 CXX test/cpp_headers/ftl.o 00:04:14.882 CXX test/cpp_headers/fsdev_module.o 00:04:14.882 CXX test/cpp_headers/gpt_spec.o 00:04:14.882 CXX test/cpp_headers/fuse_dispatcher.o 00:04:14.882 CXX test/cpp_headers/hexlify.o 00:04:14.882 CXX test/cpp_headers/fsdev.o 00:04:14.882 CXX test/cpp_headers/histogram_data.o 00:04:14.882 CXX test/cpp_headers/idxd.o 00:04:14.882 CXX test/cpp_headers/init.o 00:04:14.882 CXX test/cpp_headers/idxd_spec.o 00:04:14.882 CXX test/cpp_headers/ioat.o 00:04:14.882 CC examples/util/zipf/zipf.o 00:04:14.882 CXX test/cpp_headers/iscsi_spec.o 00:04:14.882 CXX test/cpp_headers/ioat_spec.o 00:04:14.882 CC test/thread/poller_perf/poller_perf.o 00:04:14.882 CXX test/cpp_headers/json.o 00:04:14.882 CXX test/cpp_headers/jsonrpc.o 00:04:14.882 CXX test/cpp_headers/keyring.o 00:04:14.882 CC test/dma/test_dma/test_dma.o 00:04:14.882 CC test/thread/lock/spdk_lock.o 00:04:14.882 CXX test/cpp_headers/keyring_module.o 00:04:14.882 CXX test/cpp_headers/likely.o 00:04:14.882 CXX test/cpp_headers/log.o 00:04:14.882 CC test/app/histogram_perf/histogram_perf.o 00:04:14.882 CXX test/cpp_headers/lvol.o 00:04:14.882 CXX test/cpp_headers/md5.o 00:04:14.882 CXX test/cpp_headers/mmio.o 00:04:14.882 CXX test/cpp_headers/memory.o 00:04:14.882 CXX test/cpp_headers/nbd.o 00:04:14.882 CXX test/cpp_headers/net.o 00:04:14.882 CXX test/cpp_headers/notify.o 00:04:14.882 CXX test/cpp_headers/nvme.o 00:04:14.882 CXX test/cpp_headers/nvme_intel.o 00:04:14.882 CXX test/cpp_headers/nvme_ocssd.o 00:04:14.882 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:14.882 CXX test/cpp_headers/nvme_spec.o 00:04:14.882 CXX test/cpp_headers/nvme_zns.o 00:04:14.882 CXX test/cpp_headers/nvmf_cmd.o 00:04:14.882 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:14.882 CXX test/cpp_headers/nvmf.o 00:04:14.882 CXX 
test/cpp_headers/nvmf_spec.o 00:04:14.882 CXX test/cpp_headers/nvmf_transport.o 00:04:14.882 CXX test/cpp_headers/opal.o 00:04:14.882 CXX test/cpp_headers/opal_spec.o 00:04:14.882 CXX test/cpp_headers/pci_ids.o 00:04:14.882 CXX test/cpp_headers/pipe.o 00:04:14.882 CXX test/cpp_headers/queue.o 00:04:14.882 CXX test/cpp_headers/reduce.o 00:04:14.882 CXX test/cpp_headers/rpc.o 00:04:14.882 CXX test/cpp_headers/scheduler.o 00:04:14.882 CXX test/cpp_headers/scsi.o 00:04:14.882 CXX test/cpp_headers/scsi_spec.o 00:04:14.882 CXX test/cpp_headers/sock.o 00:04:14.882 CXX test/cpp_headers/stdinc.o 00:04:14.882 CC test/app/jsoncat/jsoncat.o 00:04:14.882 LINK rpc_client_test 00:04:14.882 CC test/app/stub/stub.o 00:04:14.882 LINK spdk_nvme_discover 00:04:14.882 CXX test/cpp_headers/string.o 00:04:14.882 LINK spdk_trace_record 00:04:14.882 CC app/fio/bdev/fio_plugin.o 00:04:14.882 CXX test/cpp_headers/thread.o 00:04:14.882 CC test/app/bdev_svc/bdev_svc.o 00:04:14.882 CC test/env/mem_callbacks/mem_callbacks.o 00:04:14.882 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:14.882 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:14.882 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:14.882 LINK nvmf_tgt 00:04:14.882 LINK interrupt_tgt 00:04:14.882 LINK vtophys 00:04:14.882 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:14.882 CXX test/cpp_headers/trace.o 00:04:14.882 CXX test/cpp_headers/trace_parser.o 00:04:14.882 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:14.882 LINK iscsi_tgt 00:04:14.882 CXX test/cpp_headers/tree.o 00:04:15.141 CXX test/cpp_headers/ublk.o 00:04:15.141 LINK env_dpdk_post_init 00:04:15.141 CXX test/cpp_headers/util.o 00:04:15.141 CXX test/cpp_headers/uuid.o 00:04:15.141 LINK poller_perf 00:04:15.141 LINK histogram_perf 00:04:15.141 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:15.141 LINK zipf 00:04:15.141 CXX test/cpp_headers/version.o 00:04:15.141 CXX test/cpp_headers/vfio_user_pci.o 00:04:15.141 CXX test/cpp_headers/vfio_user_spec.o 00:04:15.141 CXX test/cpp_headers/vhost.o 00:04:15.141 CXX test/cpp_headers/vmd.o 00:04:15.141 CXX test/cpp_headers/xor.o 00:04:15.141 CXX test/cpp_headers/zipf.o 00:04:15.141 LINK jsoncat 00:04:15.141 LINK verify 00:04:15.141 LINK spdk_tgt 00:04:15.141 LINK ioat_perf 00:04:15.141 LINK stub 00:04:15.141 LINK spdk_trace 00:04:15.141 LINK bdev_svc 00:04:15.141 LINK spdk_dd 00:04:15.141 LINK test_dma 00:04:15.399 LINK pci_ut 00:04:15.399 LINK llvm_vfio_fuzz 00:04:15.399 LINK spdk_nvme 00:04:15.399 LINK vhost_fuzz 00:04:15.399 LINK nvme_fuzz 00:04:15.399 LINK spdk_nvme_perf 00:04:15.399 LINK spdk_nvme_identify 00:04:15.399 LINK mem_callbacks 00:04:15.399 LINK spdk_bdev 00:04:15.399 LINK llvm_nvme_fuzz 00:04:15.399 LINK spdk_top 00:04:15.657 CC app/vhost/vhost.o 00:04:15.657 CC examples/idxd/perf/perf.o 00:04:15.657 CC examples/sock/hello_world/hello_sock.o 00:04:15.657 CC examples/vmd/lsvmd/lsvmd.o 00:04:15.657 CC examples/vmd/led/led.o 00:04:15.657 CC examples/thread/thread/thread_ex.o 00:04:15.657 LINK memory_ut 00:04:15.657 LINK vhost 00:04:15.657 LINK led 00:04:15.657 LINK lsvmd 00:04:15.916 LINK hello_sock 00:04:15.916 LINK idxd_perf 00:04:15.916 LINK thread 00:04:15.916 LINK spdk_lock 00:04:16.174 LINK iscsi_fuzz 00:04:16.433 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:16.433 CC examples/nvme/hello_world/hello_world.o 00:04:16.433 CC examples/nvme/reconnect/reconnect.o 00:04:16.433 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:16.433 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:16.433 CC examples/nvme/abort/abort.o 
00:04:16.433 CC examples/nvme/arbitration/arbitration.o 00:04:16.433 CC examples/nvme/hotplug/hotplug.o 00:04:16.691 CC test/event/reactor_perf/reactor_perf.o 00:04:16.691 CC test/event/event_perf/event_perf.o 00:04:16.691 CC test/event/reactor/reactor.o 00:04:16.691 CC test/event/scheduler/scheduler.o 00:04:16.691 CC test/event/app_repeat/app_repeat.o 00:04:16.691 LINK pmr_persistence 00:04:16.691 LINK hello_world 00:04:16.691 LINK cmb_copy 00:04:16.691 LINK reactor_perf 00:04:16.691 LINK hotplug 00:04:16.691 LINK event_perf 00:04:16.692 LINK reactor 00:04:16.692 LINK app_repeat 00:04:16.692 LINK reconnect 00:04:16.692 LINK abort 00:04:16.692 LINK arbitration 00:04:16.692 LINK scheduler 00:04:16.692 LINK nvme_manage 00:04:16.977 CC test/nvme/connect_stress/connect_stress.o 00:04:16.977 CC test/nvme/fdp/fdp.o 00:04:16.977 CC test/nvme/e2edp/nvme_dp.o 00:04:16.977 CC test/nvme/reset/reset.o 00:04:16.977 CC test/nvme/overhead/overhead.o 00:04:16.977 CC test/nvme/fused_ordering/fused_ordering.o 00:04:16.977 CC test/nvme/boot_partition/boot_partition.o 00:04:16.977 CC test/nvme/simple_copy/simple_copy.o 00:04:16.977 CC test/nvme/reserve/reserve.o 00:04:16.977 CC test/nvme/sgl/sgl.o 00:04:16.977 CC test/nvme/err_injection/err_injection.o 00:04:16.977 CC test/nvme/startup/startup.o 00:04:16.977 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:16.977 CC test/nvme/cuse/cuse.o 00:04:16.977 CC test/nvme/aer/aer.o 00:04:16.977 CC test/nvme/compliance/nvme_compliance.o 00:04:16.977 CC test/accel/dif/dif.o 00:04:16.977 CC test/blobfs/mkfs/mkfs.o 00:04:16.977 CC test/lvol/esnap/esnap.o 00:04:16.977 LINK connect_stress 00:04:16.977 LINK boot_partition 00:04:16.977 LINK startup 00:04:16.977 LINK err_injection 00:04:16.977 LINK fused_ordering 00:04:16.977 LINK doorbell_aers 00:04:16.977 LINK reserve 00:04:16.977 LINK simple_copy 00:04:16.977 LINK nvme_dp 00:04:16.977 LINK fdp 00:04:16.977 LINK reset 00:04:16.977 LINK aer 00:04:16.977 LINK sgl 00:04:16.977 LINK overhead 00:04:17.235 LINK mkfs 00:04:17.235 LINK nvme_compliance 00:04:17.493 LINK dif 00:04:17.493 CC examples/accel/perf/accel_perf.o 00:04:17.493 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:17.493 CC examples/blob/cli/blobcli.o 00:04:17.493 CC examples/blob/hello_world/hello_blob.o 00:04:17.750 LINK cuse 00:04:17.750 LINK hello_blob 00:04:17.750 LINK hello_fsdev 00:04:18.009 LINK accel_perf 00:04:18.009 LINK blobcli 00:04:18.576 CC examples/bdev/hello_world/hello_bdev.o 00:04:18.576 CC examples/bdev/bdevperf/bdevperf.o 00:04:18.834 LINK hello_bdev 00:04:19.092 CC test/bdev/bdevio/bdevio.o 00:04:19.092 LINK bdevperf 00:04:19.350 LINK bdevio 00:04:20.286 LINK esnap 00:04:20.543 CC examples/nvmf/nvmf/nvmf.o 00:04:20.801 LINK nvmf 00:04:22.175 00:04:22.175 real 0m36.334s 00:04:22.175 user 4m38.897s 00:04:22.175 sys 1m40.951s 00:04:22.175 11:59:50 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:04:22.175 11:59:50 make -- common/autotest_common.sh@10 -- $ set +x 00:04:22.175 ************************************ 00:04:22.175 END TEST make 00:04:22.175 ************************************ 00:04:22.175 11:59:50 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:22.175 11:59:50 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:22.175 11:59:50 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:22.175 11:59:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.175 11:59:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 
00:04:22.175 11:59:50 -- pm/common@44 -- $ pid=1581296 00:04:22.175 11:59:50 -- pm/common@50 -- $ kill -TERM 1581296 00:04:22.175 11:59:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.175 11:59:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:22.175 11:59:50 -- pm/common@44 -- $ pid=1581298 00:04:22.175 11:59:50 -- pm/common@50 -- $ kill -TERM 1581298 00:04:22.175 11:59:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.175 11:59:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:22.175 11:59:50 -- pm/common@44 -- $ pid=1581300 00:04:22.175 11:59:50 -- pm/common@50 -- $ kill -TERM 1581300 00:04:22.175 11:59:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.175 11:59:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:22.175 11:59:50 -- pm/common@44 -- $ pid=1581325 00:04:22.175 11:59:50 -- pm/common@50 -- $ sudo -E kill -TERM 1581325 00:04:22.175 11:59:51 -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:22.176 11:59:51 -- common/autotest_common.sh@1681 -- # lcov --version 00:04:22.176 11:59:51 -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:22.434 11:59:51 -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:22.434 11:59:51 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:22.434 11:59:51 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:22.434 11:59:51 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:22.434 11:59:51 -- scripts/common.sh@336 -- # IFS=.-: 00:04:22.434 11:59:51 -- scripts/common.sh@336 -- # read -ra ver1 00:04:22.434 11:59:51 -- scripts/common.sh@337 -- # IFS=.-: 00:04:22.434 11:59:51 -- scripts/common.sh@337 -- # read -ra ver2 00:04:22.434 11:59:51 -- scripts/common.sh@338 -- # local 'op=<' 00:04:22.434 11:59:51 -- scripts/common.sh@340 -- # ver1_l=2 00:04:22.434 11:59:51 -- scripts/common.sh@341 -- # ver2_l=1 00:04:22.434 11:59:51 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:22.434 11:59:51 -- scripts/common.sh@344 -- # case "$op" in 00:04:22.434 11:59:51 -- scripts/common.sh@345 -- # : 1 00:04:22.434 11:59:51 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:22.434 11:59:51 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:22.434 11:59:51 -- scripts/common.sh@365 -- # decimal 1 00:04:22.434 11:59:51 -- scripts/common.sh@353 -- # local d=1 00:04:22.434 11:59:51 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:22.434 11:59:51 -- scripts/common.sh@355 -- # echo 1 00:04:22.434 11:59:51 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:22.434 11:59:51 -- scripts/common.sh@366 -- # decimal 2 00:04:22.434 11:59:51 -- scripts/common.sh@353 -- # local d=2 00:04:22.434 11:59:51 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:22.434 11:59:51 -- scripts/common.sh@355 -- # echo 2 00:04:22.434 11:59:51 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:22.434 11:59:51 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:22.434 11:59:51 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:22.434 11:59:51 -- scripts/common.sh@368 -- # return 0 00:04:22.434 11:59:51 -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:22.434 11:59:51 -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:22.434 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:22.434 --rc genhtml_branch_coverage=1 00:04:22.434 --rc genhtml_function_coverage=1 00:04:22.434 --rc genhtml_legend=1 00:04:22.434 --rc geninfo_all_blocks=1 00:04:22.434 --rc geninfo_unexecuted_blocks=1 00:04:22.434 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:22.434 ' 00:04:22.434 11:59:51 -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:22.434 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:22.434 --rc genhtml_branch_coverage=1 00:04:22.434 --rc genhtml_function_coverage=1 00:04:22.434 --rc genhtml_legend=1 00:04:22.434 --rc geninfo_all_blocks=1 00:04:22.434 --rc geninfo_unexecuted_blocks=1 00:04:22.434 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:22.434 ' 00:04:22.434 11:59:51 -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:22.434 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:22.434 --rc genhtml_branch_coverage=1 00:04:22.434 --rc genhtml_function_coverage=1 00:04:22.434 --rc genhtml_legend=1 00:04:22.434 --rc geninfo_all_blocks=1 00:04:22.434 --rc geninfo_unexecuted_blocks=1 00:04:22.434 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:22.434 ' 00:04:22.434 11:59:51 -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:22.434 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:22.434 --rc genhtml_branch_coverage=1 00:04:22.434 --rc genhtml_function_coverage=1 00:04:22.434 --rc genhtml_legend=1 00:04:22.434 --rc geninfo_all_blocks=1 00:04:22.434 --rc geninfo_unexecuted_blocks=1 00:04:22.434 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:22.434 ' 00:04:22.434 11:59:51 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:22.434 11:59:51 -- nvmf/common.sh@7 -- # uname -s 00:04:22.434 11:59:51 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:22.434 11:59:51 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:22.434 11:59:51 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:22.434 11:59:51 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:22.434 11:59:51 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:22.434 11:59:51 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:22.434 11:59:51 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:22.434 11:59:51 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:22.434 11:59:51 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:22.434 11:59:51 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:22.434 11:59:51 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:22.434 11:59:51 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:22.434 11:59:51 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:22.434 11:59:51 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:22.434 11:59:51 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:22.434 11:59:51 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:22.434 11:59:51 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:22.434 11:59:51 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:22.434 11:59:51 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:22.434 11:59:51 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:22.434 11:59:51 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:22.434 11:59:51 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:22.434 11:59:51 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:22.434 11:59:51 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:22.434 11:59:51 -- paths/export.sh@5 -- # export PATH 00:04:22.434 11:59:51 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:22.434 11:59:51 -- nvmf/common.sh@51 -- # : 0 00:04:22.434 11:59:51 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:22.434 11:59:51 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:22.434 11:59:51 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:22.434 11:59:51 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:22.434 11:59:51 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:22.434 11:59:51 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:22.434 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:22.434 11:59:51 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:22.434 11:59:51 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:22.434 11:59:51 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:22.434 11:59:51 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:22.434 11:59:51 -- spdk/autotest.sh@32 -- # uname -s 00:04:22.434 
11:59:51 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:22.434 11:59:51 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:22.434 11:59:51 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:22.434 11:59:51 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:22.434 11:59:51 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:22.434 11:59:51 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:22.434 11:59:51 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:22.434 11:59:51 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:22.434 11:59:51 -- spdk/autotest.sh@48 -- # udevadm_pid=1659652 00:04:22.434 11:59:51 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:22.434 11:59:51 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:22.434 11:59:51 -- pm/common@17 -- # local monitor 00:04:22.434 11:59:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.434 11:59:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.434 11:59:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.434 11:59:51 -- pm/common@21 -- # date +%s 00:04:22.434 11:59:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:22.434 11:59:51 -- pm/common@21 -- # date +%s 00:04:22.434 11:59:51 -- pm/common@25 -- # sleep 1 00:04:22.434 11:59:51 -- pm/common@21 -- # date +%s 00:04:22.434 11:59:51 -- pm/common@21 -- # date +%s 00:04:22.434 11:59:51 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732705191 00:04:22.435 11:59:51 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732705191 00:04:22.435 11:59:51 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732705191 00:04:22.435 11:59:51 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732705191 00:04:22.435 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732705191_collect-cpu-temp.pm.log 00:04:22.435 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732705191_collect-cpu-load.pm.log 00:04:22.435 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732705191_collect-vmstat.pm.log 00:04:22.435 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732705191_collect-bmc-pm.bmc.pm.log 00:04:23.366 11:59:52 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:23.366 11:59:52 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:23.366 11:59:52 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:23.366 11:59:52 -- common/autotest_common.sh@10 -- # set +x 
00:04:23.366 11:59:52 -- spdk/autotest.sh@59 -- # create_test_list 00:04:23.366 11:59:52 -- common/autotest_common.sh@748 -- # xtrace_disable 00:04:23.366 11:59:52 -- common/autotest_common.sh@10 -- # set +x 00:04:23.366 11:59:52 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:23.366 11:59:52 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:23.366 11:59:52 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:23.366 11:59:52 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:23.366 11:59:52 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:23.366 11:59:52 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:23.366 11:59:52 -- common/autotest_common.sh@1455 -- # uname 00:04:23.366 11:59:52 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:23.366 11:59:52 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:23.366 11:59:52 -- common/autotest_common.sh@1475 -- # uname 00:04:23.623 11:59:52 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:23.623 11:59:52 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:23.623 11:59:52 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:23.623 lcov: LCOV version 1.15 00:04:23.623 11:59:52 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:28.883 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:35.437 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:39.618 12:00:07 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:39.619 12:00:07 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:39.619 12:00:07 -- common/autotest_common.sh@10 -- # set +x 00:04:39.619 12:00:07 -- spdk/autotest.sh@78 -- # rm -f 00:04:39.619 12:00:07 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:42.239 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:42.239 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:42.239 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:42.499 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:42.499 12:00:11 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:42.499 12:00:11 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:42.499 12:00:11 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:42.499 12:00:11 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:42.499 12:00:11 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:42.499 12:00:11 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:42.499 12:00:11 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:42.499 12:00:11 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:42.499 12:00:11 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:42.499 12:00:11 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:42.499 12:00:11 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:42.499 12:00:11 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:42.499 12:00:11 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:42.499 12:00:11 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:42.499 12:00:11 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:42.760 No valid GPT data, bailing 00:04:42.760 12:00:11 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:42.760 12:00:11 -- scripts/common.sh@394 -- # pt= 00:04:42.760 12:00:11 -- scripts/common.sh@395 -- # return 1 00:04:42.760 12:00:11 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:42.760 1+0 records in 00:04:42.760 1+0 records out 00:04:42.760 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00486777 s, 215 MB/s 00:04:42.760 12:00:11 -- spdk/autotest.sh@105 -- # sync 00:04:42.760 12:00:11 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:42.760 12:00:11 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:42.760 12:00:11 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:50.878 12:00:18 -- spdk/autotest.sh@111 -- # uname -s 00:04:50.878 12:00:18 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:50.878 12:00:18 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:50.878 12:00:18 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:50.878 12:00:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:50.878 12:00:18 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:50.878 12:00:18 -- common/autotest_common.sh@10 -- # set +x 00:04:50.878 ************************************ 00:04:50.878 START TEST setup.sh 00:04:50.878 ************************************ 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:50.878 * Looking for test storage... 
00:04:50.878 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1681 -- # lcov --version 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:50.878 12:00:18 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:50.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.878 --rc genhtml_branch_coverage=1 00:04:50.878 --rc genhtml_function_coverage=1 00:04:50.878 --rc genhtml_legend=1 00:04:50.878 --rc geninfo_all_blocks=1 00:04:50.878 --rc geninfo_unexecuted_blocks=1 00:04:50.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.878 ' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:50.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.878 --rc genhtml_branch_coverage=1 00:04:50.878 --rc genhtml_function_coverage=1 00:04:50.878 --rc genhtml_legend=1 00:04:50.878 --rc geninfo_all_blocks=1 00:04:50.878 --rc geninfo_unexecuted_blocks=1 
00:04:50.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.878 ' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:50.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.878 --rc genhtml_branch_coverage=1 00:04:50.878 --rc genhtml_function_coverage=1 00:04:50.878 --rc genhtml_legend=1 00:04:50.878 --rc geninfo_all_blocks=1 00:04:50.878 --rc geninfo_unexecuted_blocks=1 00:04:50.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.878 ' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:50.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.878 --rc genhtml_branch_coverage=1 00:04:50.878 --rc genhtml_function_coverage=1 00:04:50.878 --rc genhtml_legend=1 00:04:50.878 --rc geninfo_all_blocks=1 00:04:50.878 --rc geninfo_unexecuted_blocks=1 00:04:50.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.878 ' 00:04:50.878 12:00:18 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:50.878 12:00:18 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:50.878 12:00:18 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:50.878 12:00:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:50.878 ************************************ 00:04:50.878 START TEST acl 00:04:50.878 ************************************ 00:04:50.878 12:00:18 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:50.878 * Looking for test storage... 
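The lcov version gate traced above (scripts/common.sh running "lt 1.15 2" through cmp_versions) splits both version strings on '.', '-' and ':' and compares them field by field; 1.15 sorts below 2 because the first fields already differ, so the 1.x-era coverage flags are kept. A rough equivalent under an assumed name, not the real helper:

    version_lt() {                    # returns 0 (true) when $1 sorts below $2
        local IFS=.-:
        local -a a b
        read -ra a <<< "$1"
        read -ra b <<< "$2"
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        # numeric fields only; the real cmp_versions also normalizes non-numeric parts
        for ((i = 0; i < n; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1                      # equal versions are not "less than"
    }

    version_lt 1.15 2 && echo 'use the lcov 1.x option set'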
00:04:50.878 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:50.878 12:00:18 setup.sh.acl -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:50.878 12:00:18 setup.sh.acl -- common/autotest_common.sh@1681 -- # lcov --version 00:04:50.878 12:00:18 setup.sh.acl -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:50.878 12:00:19 setup.sh.acl -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:50.878 12:00:19 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:50.879 12:00:19 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:50.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.879 --rc genhtml_branch_coverage=1 00:04:50.879 --rc genhtml_function_coverage=1 00:04:50.879 --rc genhtml_legend=1 00:04:50.879 --rc geninfo_all_blocks=1 00:04:50.879 --rc geninfo_unexecuted_blocks=1 00:04:50.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.879 ' 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:50.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.879 --rc genhtml_branch_coverage=1 00:04:50.879 --rc 
genhtml_function_coverage=1 00:04:50.879 --rc genhtml_legend=1 00:04:50.879 --rc geninfo_all_blocks=1 00:04:50.879 --rc geninfo_unexecuted_blocks=1 00:04:50.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.879 ' 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:50.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.879 --rc genhtml_branch_coverage=1 00:04:50.879 --rc genhtml_function_coverage=1 00:04:50.879 --rc genhtml_legend=1 00:04:50.879 --rc geninfo_all_blocks=1 00:04:50.879 --rc geninfo_unexecuted_blocks=1 00:04:50.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.879 ' 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:50.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:50.879 --rc genhtml_branch_coverage=1 00:04:50.879 --rc genhtml_function_coverage=1 00:04:50.879 --rc genhtml_legend=1 00:04:50.879 --rc geninfo_all_blocks=1 00:04:50.879 --rc geninfo_unexecuted_blocks=1 00:04:50.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:50.879 ' 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:50.879 12:00:19 setup.sh.acl -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:50.879 12:00:19 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:50.879 12:00:19 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:50.879 12:00:19 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:54.163 12:00:22 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:54.163 12:00:22 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:54.163 12:00:22 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.163 12:00:22 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:54.163 12:00:22 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.163 12:00:22 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:56.687 Hugepages 00:04:56.687 node hugesize free / total 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 00:04:56.687 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.687 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:56.945 12:00:25 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:56.945 12:00:25 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:56.945 12:00:25 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:56.945 12:00:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:56.945 ************************************ 00:04:56.945 START TEST denied 00:04:56.945 ************************************ 00:04:56.945 12:00:25 setup.sh.acl.denied -- 
common/autotest_common.sh@1125 -- # denied 00:04:56.945 12:00:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:56.945 12:00:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:56.945 12:00:25 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:56.945 12:00:25 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.945 12:00:25 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:01.177 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:01.177 12:00:29 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:05.363 00:05:05.363 real 0m8.393s 00:05:05.363 user 0m2.690s 00:05:05.363 sys 0m5.047s 00:05:05.363 12:00:34 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:05.363 12:00:34 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:05.363 ************************************ 00:05:05.363 END TEST denied 00:05:05.363 ************************************ 00:05:05.363 12:00:34 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:05.363 12:00:34 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:05.363 12:00:34 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:05.363 12:00:34 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:05.363 ************************************ 00:05:05.363 START TEST allowed 00:05:05.363 ************************************ 00:05:05.363 12:00:34 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:05:05.363 12:00:34 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:05:05.363 12:00:34 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:05.363 12:00:34 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:05:05.363 12:00:34 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:05.363 12:00:34 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:10.629 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:10.629 12:00:39 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:10.629 12:00:39 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:10.629 12:00:39 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:10.629 12:00:39 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:10.629 12:00:39 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:14.816 00:05:14.816 real 0m8.793s 00:05:14.816 user 0m2.513s 00:05:14.816 sys 0m4.810s 00:05:14.816 12:00:43 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:14.816 12:00:43 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:14.816 ************************************ 00:05:14.816 END TEST allowed 00:05:14.816 ************************************ 00:05:14.816 00:05:14.816 real 0m24.220s 00:05:14.816 user 0m7.597s 00:05:14.816 sys 0m14.598s 00:05:14.816 12:00:43 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:14.816 12:00:43 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:14.816 ************************************ 00:05:14.816 END TEST acl 00:05:14.816 ************************************ 00:05:14.816 12:00:43 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:14.816 12:00:43 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:14.816 12:00:43 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:14.816 12:00:43 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:14.816 ************************************ 00:05:14.816 START TEST hugepages 00:05:14.816 ************************************ 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:14.816 * Looking for test storage... 00:05:14.816 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lcov --version 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:14.816 12:00:43 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:14.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.816 --rc genhtml_branch_coverage=1 00:05:14.816 --rc genhtml_function_coverage=1 00:05:14.816 --rc genhtml_legend=1 00:05:14.816 --rc geninfo_all_blocks=1 00:05:14.816 --rc geninfo_unexecuted_blocks=1 00:05:14.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.816 ' 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:14.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.816 --rc genhtml_branch_coverage=1 00:05:14.816 --rc genhtml_function_coverage=1 00:05:14.816 --rc genhtml_legend=1 00:05:14.816 --rc geninfo_all_blocks=1 00:05:14.816 --rc geninfo_unexecuted_blocks=1 00:05:14.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.816 ' 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:14.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.816 --rc genhtml_branch_coverage=1 00:05:14.816 --rc genhtml_function_coverage=1 00:05:14.816 --rc genhtml_legend=1 00:05:14.816 --rc geninfo_all_blocks=1 00:05:14.816 --rc geninfo_unexecuted_blocks=1 00:05:14.816 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.816 ' 00:05:14.816 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:14.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.816 --rc genhtml_branch_coverage=1 00:05:14.816 --rc genhtml_function_coverage=1 00:05:14.816 --rc genhtml_legend=1 00:05:14.817 --rc geninfo_all_blocks=1 00:05:14.817 --rc geninfo_unexecuted_blocks=1 00:05:14.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.817 ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:14.817 12:00:43 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39466640 kB' 'MemAvailable: 41091220 kB' 'Buffers: 4076 kB' 'Cached: 11289080 kB' 'SwapCached: 76 kB' 'Active: 8697460 kB' 'Inactive: 3189020 kB' 'Active(anon): 7790304 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596608 kB' 'Mapped: 143960 kB' 'Shmem: 9548052 kB' 'KReclaimable: 576276 kB' 'Slab: 1578896 kB' 'SReclaimable: 576276 kB' 'SUnreclaim: 1002620 kB' 'KernelStack: 21856 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 12074572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217860 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # 
IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 
12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.817 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- 
# read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.818 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
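The repeated IFS=': ' / read -r var val _ entries above are setup/common.sh walking /proc/meminfo field by field until it reaches Hugepagesize (it appears to switch to a per-node meminfo under /sys/devices/system/node/ when a NUMA node is requested). Stripped down to the core pattern, with the helper's internals simplified:

    get=Hugepagesize
    while IFS=': ' read -r var val _; do   # same split as the trace: "Hugepagesize:  2048 kB"
        [[ $var == "$get" ]] || continue   # skip MemTotal, MemFree, ... exactly as logged
        echo "$val"                        # -> 2048 on this machine (kB per huge page)
        break
    done < /proc/meminfo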
00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@16 -- # 
default_hugepages=2048 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:14.819 12:00:43 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:05:14.819 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:14.819 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:14.819 12:00:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:14.819 ************************************ 00:05:14.819 START TEST single_node_setup 00:05:14.819 ************************************ 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1125 -- # single_node_setup 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:05:14.819 12:00:43 
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.819 12:00:43 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:18.102 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.102 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:19.482 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@137 -- # verify_nr_hugepages 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41622264 kB' 'MemAvailable: 43246884 kB' 'Buffers: 4076 kB' 'Cached: 11289212 kB' 'SwapCached: 76 kB' 'Active: 8699224 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792068 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598168 kB' 'Mapped: 144272 kB' 'Shmem: 9548184 kB' 'KReclaimable: 576316 kB' 'Slab: 1576684 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000368 kB' 'KernelStack: 21952 kB' 'PageTables: 8400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12075400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.482 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 
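Stepping back to the device lines printed just before verify_nr_hugepages started (ioatdma -> vfio-pci, nvme -> vfio-pci): those rebinds are done by scripts/setup.sh, whose internals are not part of this trace. For orientation, a generic sysfs sketch of how such a rebind can be done by hand, assuming the vfio-pci module is already loaded and using 0000:d8:00.0 from the log purely as an example address:

    bdf=0000:d8:00.0
    echo "$bdf"   > /sys/bus/pci/devices/$bdf/driver/unbind     # detach from the current driver (nvme)
    echo vfio-pci > /sys/bus/pci/devices/$bdf/driver_override   # steer future probes to vfio-pci
    echo "$bdf"   > /sys/bus/pci/drivers_probe                  # ask the PCI core to rebind the device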
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.483 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:19.484 12:00:48 
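Each get_meminfo call in this trace walks /proc/meminfo field by field under xtrace, which is why the scan above repeats once per meminfo key before echoing a single value. A compact equivalent, sketched as a hypothetical helper (the name get_meminfo_sketch and the awk one-liner are illustrative, not the setup/common.sh implementation):

    get_meminfo_sketch() {
        local key=$1 node=${2:-}
        local f=/proc/meminfo
        # per-node meminfo carries a "Node N " prefix on every line, so strip it before matching
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
            && f=/sys/devices/system/node/node$node/meminfo
        awk -v k="$key" '{ sub(/^Node [0-9]+ /, "") } $1 == (k ":") { print $2; exit }' "$f"
    }
    # e.g. get_meminfo_sketch AnonHugePages  -> 0
    #      get_meminfo_sketch HugePages_Surp -> 0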
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41623664 kB' 'MemAvailable: 43248284 kB' 'Buffers: 4076 kB' 'Cached: 11289212 kB' 'SwapCached: 76 kB' 'Active: 8699336 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792180 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598180 kB' 'Mapped: 144236 kB' 'Shmem: 9548184 kB' 'KReclaimable: 576316 kB' 'Slab: 1576684 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000368 kB' 'KernelStack: 21920 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12075416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 
-- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.484 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 
-- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.485 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41623340 kB' 'MemAvailable: 43247960 kB' 'Buffers: 4076 kB' 'Cached: 11289236 kB' 'SwapCached: 76 kB' 'Active: 8699476 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792320 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598256 kB' 'Mapped: 144740 kB' 'Shmem: 9548208 kB' 'KReclaimable: 576316 kB' 'Slab: 1576676 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000360 kB' 'KernelStack: 21936 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12076664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 
12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 
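The anon, surp and resv values gathered by these scans feed the final consistency check of verify_nr_hugepages. Grounded in the snapshots printed above (HugePages_Total 1024, HugePages_Free 1024, HugePages_Rsvd 0, HugePages_Surp 0, Hugetlb 2097152 kB = 1024 x 2048 kB), one plausible way to sanity-check the result by hand (a sketch, not the script's own check):

    expected=1024   # NRHUGE requested for this test
    total=$(awk '/^HugePages_Total:/ { print $2 }' /proc/meminfo)
    surp=$(awk  '/^HugePages_Surp:/  { print $2 }' /proc/meminfo)
    rsvd=$(awk  '/^HugePages_Rsvd:/  { print $2 }' /proc/meminfo)
    (( total - surp == expected )) && (( rsvd == 0 )) \
        && echo "hugepage pool matches NRHUGE=$expected"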
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.486 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.487 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.749 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:19.750 nr_hugepages=1024 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:19.750 resv_hugepages=0 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:19.750 surplus_hugepages=0 00:05:19.750 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:19.750 anon_hugepages=0 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41616624 kB' 'MemAvailable: 43241244 kB' 'Buffers: 4076 kB' 'Cached: 11289256 kB' 'SwapCached: 76 kB' 'Active: 8705584 kB' 'Inactive: 3189020 kB' 'Active(anon): 7798428 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604428 kB' 'Mapped: 144740 kB' 'Shmem: 9548228 kB' 'KReclaimable: 576316 kB' 'Slab: 1576580 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000264 kB' 'KernelStack: 21936 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12081580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218040 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.750 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.751 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.752 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 21705668 kB' 'MemUsed: 10928768 kB' 'SwapCached: 44 kB' 'Active: 5918176 kB' 'Inactive: 532564 kB' 'Active(anon): 5140748 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166160 kB' 'Mapped: 87876 kB' 'AnonPages: 287764 kB' 'Shmem: 4856180 kB' 'KernelStack: 11112 kB' 'PageTables: 5008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 886340 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 483976 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.753 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:19.754 node0=1024 expecting 1024 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:19.754 00:05:19.754 real 0m5.009s 00:05:19.754 user 0m1.232s 00:05:19.754 sys 0m2.237s 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:19.754 12:00:48 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:05:19.754 ************************************ 00:05:19.754 END TEST single_node_setup 00:05:19.754 ************************************ 00:05:19.754 12:00:48 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:05:19.754 12:00:48 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:19.754 12:00:48 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:19.754 12:00:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:19.754 ************************************ 00:05:19.754 START TEST even_2G_alloc 00:05:19.754 ************************************ 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:19.754 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
_nr_hugepages=1024 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:19.755 12:00:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:23.052 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:23.052 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:23.052 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:23.052 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:23.053 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:23.053 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41650176 kB' 'MemAvailable: 43274796 kB' 'Buffers: 4076 kB' 'Cached: 11289376 kB' 'SwapCached: 76 kB' 'Active: 8698772 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791616 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597552 kB' 'Mapped: 144324 kB' 'Shmem: 9548348 kB' 'KReclaimable: 576316 kB' 'Slab: 1577068 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000752 kB' 'KernelStack: 22064 kB' 'PageTables: 8868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12076284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218260 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 
12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.053 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 
12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.054 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41653712 kB' 'MemAvailable: 43278332 kB' 'Buffers: 4076 kB' 'Cached: 11289380 kB' 'SwapCached: 76 kB' 'Active: 8700732 kB' 
'Inactive: 3189020 kB' 'Active(anon): 7793576 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599496 kB' 'Mapped: 144696 kB' 'Shmem: 9548352 kB' 'KReclaimable: 576316 kB' 'Slab: 1577012 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000696 kB' 'KernelStack: 21888 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12079904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.055 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.056 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.057 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41651192 kB' 'MemAvailable: 43275812 kB' 'Buffers: 4076 kB' 'Cached: 11289380 kB' 'SwapCached: 76 kB' 'Active: 8703796 kB' 'Inactive: 3189020 kB' 'Active(anon): 7796640 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602660 kB' 'Mapped: 145044 kB' 'Shmem: 9548352 kB' 'KReclaimable: 576316 kB' 'Slab: 1576908 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000592 kB' 'KernelStack: 21952 kB' 'PageTables: 8492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12082444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218200 kB' 'VmallocChunk: 
0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.058 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.059 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:23.060 nr_hugepages=1024 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:23.060 resv_hugepages=0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:23.060 surplus_hugepages=0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:23.060 anon_hugepages=0 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41642536 kB' 'MemAvailable: 43267156 kB' 'Buffers: 4076 kB' 'Cached: 11289420 kB' 'SwapCached: 76 kB' 'Active: 8703556 kB' 'Inactive: 3189020 kB' 'Active(anon): 7796400 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602284 kB' 'Mapped: 144696 kB' 'Shmem: 9548392 kB' 'KReclaimable: 576316 kB' 'Slab: 1576908 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000592 kB' 'KernelStack: 21920 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12082468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218120 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 
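The long run of "[[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] ... continue" entries above is the xtrace of setup/common.sh's get_meminfo() walking the captured /proc/meminfo dump one field at a time until it reaches the requested key. A minimal, self-contained sketch of that lookup follows; the sed-based "Node N" prefix stripping and the simplified error handling are illustrative assumptions, not copied from setup/common.sh:

# Sketch of the lookup the trace performs: print the value of one meminfo
# field, either system-wide or for a single NUMA node.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node statistics live in sysfs when a node index is supplied.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the repeated "continue" entries in the trace
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

Run against the dump above, get_meminfo HugePages_Rsvd would print 0 and get_meminfo HugePages_Total would print 1024, matching the resv=0 and nr_hugepages=1024 values echoed in the trace.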
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.060 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.061 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:23.062 12:00:51 
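At this point the trace has collected nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0 and, via get_nodes, an expected 512 pages on each of the two NUMA nodes (no_nodes=2). A hedged sketch of that accounting, reusing the get_meminfo helper sketched above; check_even_2g_alloc is an illustrative wrapper name, not a function taken from setup/hugepages.sh:

# System-wide total must equal the requested pages plus surplus and reserved,
# and each of the two nodes should hold an even half of the pool.
check_even_2g_alloc() {
    local nr_hugepages=1024 node
    local resv surp total
    resv=$(get_meminfo HugePages_Rsvd)      # 0 per resv=0 above
    surp=$(get_meminfo HugePages_Surp)      # 0 per surplus_hugepages=0 above
    total=$(get_meminfo HugePages_Total)    # 1024 per the dump above
    (( total == nr_hugepages + surp + resv )) || return 1
    for node in 0 1; do
        local want=512
        (( want += $(get_meminfo HugePages_Surp "$node") ))
        (( $(get_meminfo HugePages_Total "$node") == want )) || return 1
    done
}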
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:23.062 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22757792 kB' 'MemUsed: 9876644 kB' 'SwapCached: 44 kB' 'Active: 5913240 kB' 'Inactive: 532564 kB' 'Active(anon): 5135812 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166264 kB' 'Mapped: 87380 kB' 'AnonPages: 282736 kB' 'Shmem: 4856284 kB' 'KernelStack: 11080 kB' 'PageTables: 4960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 886652 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 484288 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 
12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.063 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 
12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:23.064 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18885852 kB' 'MemUsed: 8763508 kB' 'SwapCached: 32 kB' 'Active: 2784656 kB' 'Inactive: 2656456 kB' 'Active(anon): 2654928 kB' 'Inactive(anon): 2351016 kB' 'Active(file): 129728 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5127324 kB' 'Mapped: 56812 kB' 'AnonPages: 313852 kB' 'Shmem: 4692124 kB' 'KernelStack: 10952 kB' 'PageTables: 3624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 173952 kB' 'Slab: 690256 kB' 'SReclaimable: 173952 kB' 'SUnreclaim: 516304 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:23.065 12:00:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:23.065 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
[setup/common.sh@32 then checks every remaining /proc/meminfo key - SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free - against HugePages_Surp, and each one falls through to 'continue']
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
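The scan condensed above is, at bottom, a single key/value lookup over /proc/meminfo. A minimal sketch of that pattern (illustrative only, not the SPDK common.sh implementation; the function name get_field is invented here):

```bash
#!/usr/bin/env bash
# Minimal sketch of the scan seen in the trace: split each /proc/meminfo line
# on ': ', skip ('continue') every key that is not the requested one, then
# print its value. 'get_field' is an invented name, not the SPDK helper.
get_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching keys just hit 'continue'
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}

get_field HugePages_Surp   # prints 0 on this host, matching the 'echo 0' above
```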
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125-127 -- # [for each node: sorted_t[nodes_test[node]]=1, sorted_s[nodes_sys[node]]=1, echo the expectation]
00:05:23.066 node0=512 expecting 512
00:05:23.066 node1=512 expecting 512
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:23.066 real 0m3.351s
00:05:23.066 user 0m1.176s
00:05:23.066 sys 0m2.197s
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:23.066 12:00:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:23.066 ************************************
00:05:23.066 END TEST even_2G_alloc
00:05:23.066 ************************************
00:05:23.066 12:00:51 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:23.066 12:00:51 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:23.066 12:00:51 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:23.066 12:00:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:23.326 ************************************
00:05:23.326 START TEST odd_alloc
00:05:23.326 ************************************
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
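The get_test_nr_hugepages call above turns the 2,098,176 kB request (HUGEMEM=2049 MB) into the odd page count of 1025. A minimal sketch of the arithmetic, assuming 2,048 kB hugepages and round-up division; the exact expression in hugepages.sh may differ:

```bash
# Sketch of the size-to-page-count conversion, assuming 2048 kB hugepages and
# round-up division; the exact expression in hugepages.sh may differ.
size_kb=2098176     # HUGEMEM=2049 MB expressed in kB
hugepage_kb=2048
nr_hugepages=$(( (size_kb + hugepage_kb - 1) / hugepage_kb ))
echo "$nr_hugepages"   # 1025
```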
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:23.326 12:00:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:26.623 0000:00:04.0 through 0000:00:04.7 and 0000:80:04.0 through 0000:80:04.7 (8086 2021): Already using the vfio-pci driver [16 lines, one per device]
00:05:26.623 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88-93 -- # local node sorted_t sorted_s surp resv anon
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
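The setup/hugepages.sh@80-83 iterations a few lines above distribute those 1025 pages over the two NUMA nodes, 512 on node1 and 513 on node0. A minimal sketch of a loop that reproduces the values the trace prints (513, 1, then 0, 0); the real script may structure it differently:

```bash
# Sketch of the per-node split: each pass gives the highest-numbered remaining
# node its share of the pages still unassigned, so the odd page lands on node 0.
_nr_hugepages=1025
_no_nodes=2
declare -a nodes_test
while (( _no_nodes > 0 )); do
    nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
    : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # 513, then 0 (as in the trace)
    : $(( --_no_nodes ))                                  # 1, then 0 (as in the trace)
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"       # node0=513 node1=512
```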
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:26.623 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41640404 kB' 'MemAvailable: 43265024 kB' 'Buffers: 4076 kB' 'Cached: 11289548 kB' 'SwapCached: 76 kB' 'Active: 8699012 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791856 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597080 kB' 'Mapped: 143160 kB' 'Shmem: 9548520 kB' 'KReclaimable: 576316 kB' 'Slab: 1577052 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000736 kB' 'KernelStack: 21872 kB' 'PageTables: 8464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12066876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
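The get_meminfo call traced above reads /proc/meminfo because no node argument was given; with a node argument it would read the per-node file and strip the leading "Node <n> " prefix, as the mem=("${mem[@]#Node +([0-9]) }") line shows. A self-contained sketch of that behaviour (get_meminfo_sketch is an invented name, not the SPDK helper):

```bash
#!/usr/bin/env bash
# Sketch of the node handling visible above: with no node argument read
# /proc/meminfo; with one, read the per-node file and strip the "Node <n> "
# prefix before the same key lookup. 'get_meminfo_sketch' is an invented name.
shopt -s extglob
get_meminfo_sketch() {
    local get=$1 node=${2:-} var val _ line
    local mem_f=/proc/meminfo
    local -a mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")     # per-node files prefix every line with "Node 0 "
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo_sketch AnonHugePages        # system-wide, as in the trace above
get_meminfo_sketch HugePages_Free 0     # node 0 only
```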
[setup/common.sh@31-32 then scan that snapshot key by key - MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted - each falling through to 'continue' until AnonHugePages matches]
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:26.625 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41641888 kB' 'MemAvailable: 43266508 kB' 'Buffers: 4076 kB' 'Cached: 11289564 kB' 'SwapCached: 76 kB' 'Active: 8697672 kB' 'Inactive: 3189020 kB' 'Active(anon): 7790516 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596192 kB' 'Mapped: 143048 kB' 'Shmem: 9548536 kB' 'KReclaimable: 576316 kB' 'Slab: 1577028 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000712 kB' 'KernelStack: 21824 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12066896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
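At this point the verifier has recorded anon=0 and is about to re-scan the same snapshot for HugePages_Surp, with HugePages_Rsvd next. A compact, hedged way to express the checks this builds toward (awk is used here purely for brevity; the expected values match the snapshots above):

```bash
# Compact restatement of the checks being built up here; the values below match
# the snapshots printed above (AnonHugePages 0 kB, HugePages_Surp 0, HugePages_Rsvd 0).
anon=$(awk '/^AnonHugePages:/  {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
rsvd=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
(( anon == 0 && surp == 0 && rsvd == 0 )) || echo "unexpected hugepage accounting" >&2
```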
[setup/common.sh@31-32 scan this second snapshot key by key - MemTotal through HugePages_Free - each falling through to 'continue' until HugePages_Surp matches]
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
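Besides the system-wide fields read here, the per-node counters that the test's later node0=.../node1=... expectations compare can also be read directly from sysfs. A hedged sketch, assuming the standard kernel sysfs layout; this is not part of the SPDK scripts:

```bash
# Read the per-node 2 MB hugepage counters straight from sysfs; these are the
# numbers the later node0=.../node1=... expectations are compared against.
for n in /sys/devices/system/node/node[0-9]*; do
    echo "$(basename "$n"): $(cat "$n/hugepages/hugepages-2048kB/nr_hugepages")"
done
```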
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:26.627 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41643612 kB' 'MemAvailable: 43268232 kB' 'Buffers: 4076 kB' 'Cached: 11289568 kB' 'SwapCached: 76 kB' 'Active: 8699448 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792292 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598000 kB' 'Mapped: 143552 kB' 'Shmem: 9548540 kB' 'KReclaimable: 576316 kB' 'Slab: 1577004 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000688 kB' 'KernelStack: 21824 kB' 'PageTables: 8300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12069196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[setup/common.sh@31-32 begin the same key-by-key scan of this snapshot for HugePages_Rsvd: MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages and Mapped each fall through to 'continue']
00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.628 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.629 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:26.630 nr_hugepages=1025 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:26.630 resv_hugepages=0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:26.630 surplus_hugepages=0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:26.630 anon_hugepages=0 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41638956 kB' 'MemAvailable: 43263576 kB' 'Buffers: 4076 kB' 'Cached: 11289608 kB' 'SwapCached: 76 kB' 'Active: 8697760 kB' 'Inactive: 3189020 kB' 'Active(anon): 7790604 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596320 kB' 'Mapped: 143380 kB' 'Shmem: 9548580 kB' 'KReclaimable: 576316 kB' 'Slab: 1577004 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000688 kB' 'KernelStack: 21840 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12067952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 
kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.630 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.631 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22759004 kB' 'MemUsed: 9875432 kB' 'SwapCached: 44 kB' 'Active: 5912964 kB' 'Inactive: 532564 kB' 'Active(anon): 5135536 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166404 kB' 'Mapped: 87088 kB' 'AnonPages: 282364 kB' 'Shmem: 4856424 kB' 'KernelStack: 11096 kB' 'PageTables: 4856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 886712 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 484348 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.632 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.633 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 
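The xtrace above shows setup/common.sh's get_meminfo walking a meminfo file key by key until it reaches the requested field (here HugePages_Surp for node 0), while setup/hugepages.sh's odd_alloc test requests 1025 huge pages and expects them split 513/512 across the two NUMA nodes. A minimal standalone sketch of that lookup, reconstructed from the trace rather than copied from setup/common.sh (the helper name below is hypothetical), could look like this:

  # Sketch only: pick the per-node meminfo file when a node index is given,
  # strip the "Node N " prefix those files carry, then scan "key: value"
  # pairs until the requested field is found and print its value.
  get_meminfo_sketch() {
      local get=$1 node=$2               # field name, optional NUMA node index
      local mem_f=/proc/meminfo
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local line var val _
      while IFS= read -r line; do
          line=${line#"Node $node "}     # per-node files prefix every line with "Node N "
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$get" ]]; then
              echo "$val"                # e.g. 513 for HugePages_Total on node 0
              return 0
          fi
      done < "$mem_f"
      return 1
  }
  # e.g. get_meminfo_sketch HugePages_Surp 0   ->  0 in the run traced above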
00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18871132 kB' 'MemUsed: 8778228 kB' 'SwapCached: 32 kB' 'Active: 2787808 kB' 'Inactive: 2656456 kB' 'Active(anon): 2658080 kB' 'Inactive(anon): 2351016 kB' 'Active(file): 129728 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5127356 kB' 'Mapped: 56616 kB' 'AnonPages: 314460 kB' 'Shmem: 4692156 kB' 'KernelStack: 10728 kB' 'PageTables: 3468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 173952 kB' 'Slab: 690292 kB' 'SReclaimable: 173952 kB' 'SUnreclaim: 516340 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.634 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
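The per-node values this scan feeds into are checked at the end of the test below ("node0=513 expecting 513", "node1=512 expecting 512"): odd_alloc requests an odd total of 2 MiB pages (513 + 512 = 1025 on this two-node box) and expects the split to give the extra page to the first node. A small sketch of that ceiling-split, with illustrative names rather than the real hugepages.sh helpers:

# Split an odd page count across nodes; the first node(s) absorb the remainder.
split_pages() {
    local total=$1 nodes=$2 node base=$(( $1 / $2 )) extra=$(( $1 % $2 ))
    for (( node = 0; node < nodes; node++ )); do
        echo "node$node=$(( base + (node < extra ? 1 : 0) ))"
    done
}
split_pages 1025 2   # -> node0=513, node1=512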
00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.635 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:26.636 node0=513 expecting 513 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:26.636 node1=512 expecting 512 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:26.636 00:05:26.636 real 0m3.308s 00:05:26.636 user 0m1.224s 00:05:26.636 sys 0m2.088s 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:26.636 12:00:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:26.636 ************************************ 00:05:26.636 END TEST odd_alloc 00:05:26.636 ************************************ 00:05:26.636 12:00:55 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test 
custom_alloc custom_alloc 00:05:26.636 12:00:55 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:26.636 12:00:55 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:26.636 12:00:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:26.636 ************************************ 00:05:26.636 START TEST custom_alloc 00:05:26.636 ************************************ 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=, 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=() 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@166 -- # (( 2 > 1 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 )) 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:26.636 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@73 -- # (( 2 > 0 )) 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:26.637 12:00:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:29.949 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:29.949 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:29.949 
12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40593564 kB' 'MemAvailable: 42218184 kB' 'Buffers: 4076 kB' 'Cached: 11289724 kB' 'SwapCached: 76 kB' 'Active: 8698680 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791524 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597132 kB' 'Mapped: 143164 kB' 'Shmem: 9548696 kB' 'KReclaimable: 576316 kB' 'Slab: 1577676 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1001360 kB' 'KernelStack: 21856 kB' 'PageTables: 8412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12067572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.949 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.950 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
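The scan running through here is the anonymous-hugepage half of verify_nr_hugepages: earlier in this trace (hugepages.sh@95-96) the test compared the transparent hugepage setting "always [madvise] never" against "[never]" before reading AnonHugePages, and ended up with anon=0. A minimal sketch of that gate; the sysfs path below is an assumption about where that string comes from, not something shown in the trace:

# Only count AnonHugePages toward the balance when THP is not fully disabled.
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
anon=0
if [[ $thp != *"[never]"* ]]; then
    anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
fi
echo "anon=$anon"   # 0 here, as in the trace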
00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40593332 kB' 'MemAvailable: 42217952 kB' 'Buffers: 4076 kB' 'Cached: 11289744 kB' 'SwapCached: 76 kB' 'Active: 8698656 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791500 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597084 kB' 'Mapped: 143052 kB' 'Shmem: 9548716 kB' 'KReclaimable: 576316 kB' 'Slab: 1577676 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1001360 kB' 'KernelStack: 21824 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12067592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.951 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
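The repeated "[[ <key> == HugePages_Surp ]]" / "continue" pairs in this stretch of the trace are the harness's get_meminfo helper walking a /proc/meminfo snapshot one "key: value" pair at a time until it reaches the requested counter. A minimal standalone sketch of that scan, assuming a simplified helper that takes the counter name and reads the snapshot on stdin (the real setup/common.sh code differs in detail):

    get_meminfo() {
        # $1 = counter to look up (e.g. HugePages_Surp); snapshot text arrives on stdin
        # simplified sketch of the helper traced above, not the literal setup/common.sh code
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip every non-matching key, as the trace shows
            echo "$val"
            return 0
        done
        return 1
    }

    # example: surp=$(get_meminfo HugePages_Surp < /proc/meminfo)   # this pass records surp=0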
00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.952 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.953 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40590060 kB' 'MemAvailable: 42214680 kB' 'Buffers: 4076 kB' 'Cached: 11289744 kB' 'SwapCached: 76 kB' 'Active: 8698740 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791584 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597148 kB' 'Mapped: 143052 kB' 'Shmem: 9548716 kB' 'KReclaimable: 576316 kB' 'Slab: 1577676 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1001360 kB' 'KernelStack: 21840 kB' 'PageTables: 8340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12067612 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.953 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.216 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 
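For reference, the /proc/meminfo snapshot printed at the start of this pass is internally consistent on the hugepage side: 1536 pages at the reported 2048 kB Hugepagesize account exactly for the Hugetlb figure. A quick shell check of that arithmetic, with the numbers copied from the snapshot above (illustrative only, not part of the test script):

    pages=1536      # HugePages_Total from the snapshot
    page_kb=2048    # Hugepagesize in kB
    echo $(( pages * page_kb ))   # 3145728, matching the 'Hugetlb: 3145728 kB' line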
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.217 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:05:30.218 nr_hugepages=1536 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:30.218 resv_hugepages=0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:30.218 surplus_hugepages=0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:30.218 anon_hugepages=0 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:30.218 
12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40591008 kB' 'MemAvailable: 42215628 kB' 'Buffers: 4076 kB' 'Cached: 11289784 kB' 'SwapCached: 76 kB' 'Active: 8698408 kB' 'Inactive: 3189020 kB' 'Active(anon): 7791252 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 596732 kB' 'Mapped: 143052 kB' 'Shmem: 9548756 kB' 'KReclaimable: 576316 kB' 'Slab: 1577676 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1001360 kB' 'KernelStack: 21824 kB' 'PageTables: 8292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12067632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 
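At this point surp and resv have both been read back as 0, and hugepages.sh only lets the custom_alloc test continue if the kernel's allocation matches the request, i.e. HugePages_Total equals nr_hugepages + surp + resv. A condensed sketch of that gate using the values recorded in this run (variable names mirror the trace, but the control flow is simplified here):

    nr_hugepages=1536   # requested for this test
    surp=0              # HugePages_Surp read back above
    resv=0              # HugePages_Rsvd read back above
    total=1536          # HugePages_Total reported by /proc/meminfo

    # proceed only when the kernel's total equals request + surplus + reserved
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent: $total pages"
    else
        echo "hugepage accounting mismatch" >&2
        exit 1
    fi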
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.218 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 
12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 
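Once the global total is confirmed, get_nodes takes over: this custom_alloc pass records 512 pages on node 0 and 1024 on node 1 (512 + 1024 = 1536), and each node is then re-checked against its own /sys/devices/system/node/nodeN/meminfo instead of the global /proc/meminfo. A small sketch of that per-node path selection, reusing the simplified get_meminfo sketched earlier (again an approximation of the traced setup/common.sh logic, not a copy of it):

    node_meminfo() {
        # node-local files prefix every line with "Node <id> ", which must be
        # stripped before the same key scan can be applied
        local node=$1 counter=$2
        local f=/sys/devices/system/node/node${node}/meminfo
        if [[ -e $f ]]; then
            sed "s/^Node ${node} //" "$f" | get_meminfo "$counter"
        else
            get_meminfo "$counter" < /proc/meminfo
        fi
    }

    # in this run node 0 holds 512 of the 1536 pages and node 1 the remaining 1024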
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
[[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:30.219 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22770036 kB' 'MemUsed: 9864400 kB' 'SwapCached: 44 kB' 'Active: 5914736 kB' 'Inactive: 532564 kB' 'Active(anon): 5137308 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166564 kB' 'Mapped: 86588 kB' 'AnonPages: 283988 kB' 'Shmem: 4856584 kB' 'KernelStack: 11128 kB' 'PageTables: 4944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 887056 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 484692 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.220 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.220 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17821392 kB' 'MemUsed: 9827968 kB' 'SwapCached: 
32 kB' 'Active: 2784048 kB' 'Inactive: 2656456 kB' 'Active(anon): 2654320 kB' 'Inactive(anon): 2351016 kB' 'Active(file): 129728 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5127376 kB' 'Mapped: 56464 kB' 'AnonPages: 313156 kB' 'Shmem: 4692176 kB' 'KernelStack: 10712 kB' 'PageTables: 3396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 173952 kB' 'Slab: 690620 kB' 'SReclaimable: 173952 kB' 'SUnreclaim: 516668 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.221 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:30.222 node0=512 expecting 512 00:05:30.222 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024' 00:05:30.223 node1=1024 expecting 1024 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:30.223 00:05:30.223 real 0m3.609s 00:05:30.223 user 0m1.345s 00:05:30.223 sys 0m2.331s 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:30.223 12:00:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:30.223 ************************************ 00:05:30.223 END TEST custom_alloc 00:05:30.223 ************************************ 00:05:30.223 12:00:58 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:30.223 12:00:58 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:30.223 12:00:58 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:05:30.223 12:00:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:30.223 ************************************ 00:05:30.223 START TEST no_shrink_alloc 00:05:30.223 ************************************ 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:30.223 12:00:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:33.515 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.7 
(8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:33.515 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41638104 kB' 'MemAvailable: 43262724 kB' 'Buffers: 4076 kB' 'Cached: 11289892 kB' 'SwapCached: 76 kB' 'Active: 8700260 kB' 'Inactive: 3189020 kB' 'Active(anon): 7793104 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598416 kB' 'Mapped: 143108 kB' 'Shmem: 9548864 kB' 'KReclaimable: 576316 kB' 'Slab: 1577128 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000812 kB' 'KernelStack: 21968 kB' 'PageTables: 8772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 
'CommitLimit: 37481924 kB' 'Committed_AS: 12070892 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218260 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.779 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.780 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 
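[Note] The long runs of "[[ FieldName == ... ]] / continue" entries above repeat for every meminfo field because get_meminfo (setup/common.sh) scans the chosen meminfo file key by key until it finds the requested one. Below is a minimal sketch of that helper, reconstructed from the xtrace output in this log; the traced variable names (get, node, mem_f, mem, var, val) are kept, while the missing-node fallback and the final return code are assumptions, not the verbatim SPDK source.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern that strips the "Node N " prefix

    # get_meminfo KEY [NODE] - print the value of KEY from /proc/meminfo, or from
    # /sys/devices/system/node/nodeNODE/meminfo when NODE is given, as in the trace.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f mem

        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        elif [[ -n $node ]]; then
            return 1   # assumed fallback: a specific node was requested but has no meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line with "Node N "

        # Field-by-field scan; each iteration is one "[[ X == ... ]] / continue" pair
        # in the trace. The "kB" suffix lands in the throwaway field because IFS=': '
        # splits on both the colon and the space, so only the number is echoed.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "$val" && return 0
            continue
        done < <(printf '%s\n' "${mem[@]}")

        return 1
    }

    # Example call, matching hugepages.sh@116 above:
    # get_meminfo HugePages_Surp 0   # -> 0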
00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41643320 kB' 'MemAvailable: 43267940 kB' 'Buffers: 4076 kB' 'Cached: 11289892 kB' 'SwapCached: 76 kB' 'Active: 8699208 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792052 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597460 kB' 'Mapped: 143152 kB' 'Shmem: 9548864 kB' 'KReclaimable: 576316 kB' 'Slab: 1577140 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000824 kB' 'KernelStack: 21936 kB' 'PageTables: 8568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12069136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.781 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.782 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node= 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41642648 kB' 'MemAvailable: 43267268 kB' 'Buffers: 4076 kB' 'Cached: 11289892 kB' 'SwapCached: 76 kB' 'Active: 8699328 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792172 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597532 kB' 'Mapped: 143152 kB' 'Shmem: 9548864 kB' 'KReclaimable: 576316 kB' 'Slab: 1577140 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000824 kB' 'KernelStack: 22080 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12070764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.783 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.784 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:33.785 nr_hugepages=1024 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:33.785 resv_hugepages=0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:33.785 surplus_hugepages=0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:33.785 anon_hugepages=0 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41642588 kB' 'MemAvailable: 43267208 kB' 'Buffers: 4076 kB' 'Cached: 11289896 kB' 'SwapCached: 76 kB' 'Active: 8699544 kB' 'Inactive: 3189020 kB' 'Active(anon): 7792388 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597744 kB' 'Mapped: 143152 kB' 'Shmem: 9548868 kB' 'KReclaimable: 576316 kB' 'Slab: 1577140 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000824 kB' 'KernelStack: 22016 kB' 'PageTables: 8932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12070784 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
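The two follow-up queries resolve the same way: HugePages_Surp is captured as surp=0 at hugepages.sh@98 and HugePages_Rsvd as resv=0 at hugepages.sh@99, after which the script echoes nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0 and evaluates the arithmetic guards at hugepages.sh@106 and @108 before re-reading HugePages_Total in the scan continuing below. The guards are plain integer checks requiring the resolved count of 1024 to equal nr_hugepages plus the surplus and reserved pages, and nr_hugepages alone. A small sketch of that accounting follows; the literal 1024 mirrors the trace (xtrace shows only the expanded value, not the variable it came from), and the failure handling is an assumption.

    # Values produced by the get_meminfo walks traced above (this run):
    nr_hugepages=1024   # echoed at hugepages.sh@101
    surp=0              # HugePages_Surp, hugepages.sh@98
    resv=0              # HugePages_Rsvd, hugepages.sh@99
    anon=0              # AnonHugePages,  hugepages.sh@96

    # hugepages.sh@106 and @108 as rendered by xtrace; what the script does on
    # failure is not visible in this excerpt, so a warning stands in here.
    (( 1024 == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2
    (( 1024 == nr_hugepages ))               || echo "hugepage pool size drifted" >&2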
00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.785 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 
12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.786 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.787 12:01:02 
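For readability, here is a minimal bash sketch of the loop the xtrace above is stepping through. It is reconstructed from the trace (the function, variable and file names are the ones visible in it), not copied from the real setup/common.sh, so details may differ:

    shopt -s extglob                     # the +([0-9]) patterns below need extglob
    # get_meminfo FIELD [NODE] - print one value from /proc/meminfo or a node's meminfo
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f mem
        mem_f=/proc/meminfo
        # with a node argument, read that node's own meminfo instead
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")      # strip the "Node N " prefix of per-node files
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue  # the long run of skipped fields seen above
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Called without a node argument it reads /proc/meminfo, which is where the 1024 for HugePages_Total above came from; hugepages.sh@109 then checks that value against nr_hugepages + surp + resv before walking the per-node counts.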
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:33.787 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 21747512 kB' 'MemUsed: 10886924 kB' 'SwapCached: 44 kB' 'Active: 5916612 kB' 'Inactive: 532564 kB' 'Active(anon): 5139184 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166656 kB' 'Mapped: 86688 kB' 'AnonPages: 285212 kB' 'Shmem: 4856676 kB' 'KernelStack: 11144 kB' 'PageTables: 4968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 886504 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 484140 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
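The node0 numbers in the dump above (HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Surp: 0) can also be cross-checked without parsing meminfo at all. A small example using the standard per-node sysfs counters; these paths are kernel-standard and are not what the trace itself reads:

    node=/sys/devices/system/node/node0/hugepages/hugepages-2048kB
    cat "$node/nr_hugepages"        # total 2 MiB pages reserved on node0 (1024 expected here)
    cat "$node/free_hugepages"      # pages not currently backing a mapping (1024 expected here)
    cat "$node/surplus_hugepages"   # overcommitted pages beyond nr_hugepages (0 expected here)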
[... setup/common.sh@31-32 then walk the node0 dump field by field (MemTotal, MemFree, MemUsed, ..., HugePages_Total, HugePages_Free), skipping each entry with "continue" until HugePages_Surp matches ...]
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
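As a hedged sketch of what the hugepages.sh lines just above are doing (values taken from this run; the initialisation of nodes_test is an assumption, since it happens outside this excerpt): reserved and surplus pages are folded into the expected count per node, and both the expected and the sysfs views are collapsed into sets keyed by the counts themselves, so the later comparison does not care which node holds the pages:

    declare -a nodes_test nodes_sys sorted_t sorted_s
    nr_hugepages=1024 resv=0
    nodes_sys=(1024 0)              # filled by get_nodes: node0 holds 1024 pages, node1 none
    nodes_test[0]=$nr_hugepages     # assumption: the test expects all pages on node0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))      # hugepages.sh@115
        surp=0                              # get_meminfo HugePages_Surp $node returned 0 above
        (( nodes_test[node] += surp ))      # hugepages.sh@116
    done
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1        # counts used as array indices
        sorted_s[nodes_sys[node]]=1         # so the compare is independent of node order
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
    done

Running this prints the same "node0=1024 expecting 1024" line that appears next in the log (the exact operands of that echo are not visible in the xtrace, so treat the last line as an illustration).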
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:33.788 node0=1024 expecting 1024
12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:33.788 12:01:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:37.078 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:37.078 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:37.078 INFO: Requested 512 hugepages but 1024 already allocated on node0
12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
[... setup/hugepages.sh@88-93 declare the locals node, sorted_t, sorted_s, surp, resv and anon ...]
12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
[... setup/common.sh@17-20 set get=AnonHugePages with node unset (system-wide read) and declare the scratch locals ...]
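The run above is the actual point of this no_shrink_alloc test: with CLEAR_HUGE=no, NRHUGE=512 and HUGENODE=0 the allocator is re-run and, as the INFO line shows, it does not shrink the existing allocation, leaving all 1024 pages on node0. A minimal way to reproduce that invocation by hand (same environment variables and workspace path as in the log, run with root privileges):

    sudo CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh

The verify_nr_hugepages call that follows then checks that the 1024 pages are indeed still present.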
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:37.078 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41649620 kB' 'MemAvailable: 43274240 kB' 'Buffers: 4076 kB' 'Cached: 11290040 kB' 'SwapCached: 76 kB' 'Active: 8701744 kB' 'Inactive: 3189020 kB' 'Active(anon): 7794588 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599296 kB' 'Mapped: 143184 kB' 'Shmem: 9549012 kB' 'KReclaimable: 576316 kB' 'Slab: 1576948 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000632 kB' 'KernelStack: 22112 kB' 'PageTables: 8724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12071772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... setup/common.sh@31-32 walk /proc/meminfo field by field, skipping every entry that is not AnonHugePages ...]
00:05:37.079 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:37.079 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:37.079 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
[... setup/common.sh@17-29 run again with get=HugePages_Surp and node unset, reading /proc/meminfo a second time ...]
00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41651588 kB' 'MemAvailable: 43276208 kB' 'Buffers: 4076 kB' 'Cached: 11290040 kB' 'SwapCached: 76 kB' 'Active: 8700516 kB' 'Inactive: 3189020 kB' 'Active(anon): 7793360 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598524 kB' 'Mapped: 143080 kB' 'Shmem: 9549012 kB' 'KReclaimable: 576316 kB' 'Slab: 1576940 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000624 kB' 'KernelStack: 22160 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12071788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
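For the verification pass in progress here, a hedged outline of the quantities being gathered, based on the hugepages.sh line numbers in the trace and the get_meminfo sketch earlier; the HugePages_Rsvd read and the final comparison are assumptions modelled on the identical check at hugepages.sh@109 in the first pass:

    anon=$(get_meminfo AnonHugePages)     # hugepages.sh@96: 0 kB here (THP string had no "[never]")
    surp=$(get_meminfo HugePages_Surp)    # hugepages.sh@98: the /proc/meminfo read in progress above
    resv=$(get_meminfo HugePages_Rsvd)    # assumption: "resv" most plausibly comes from HugePages_Rsvd
    # the check itself, mirroring hugepages.sh@109; nr_hugepages is the count the test expects (1024 here)
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))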
setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.080 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.081 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41650780 kB' 'MemAvailable: 43275400 kB' 'Buffers: 4076 kB' 'Cached: 11290064 kB' 'SwapCached: 76 kB' 'Active: 8700780 kB' 'Inactive: 3189020 kB' 'Active(anon): 7793624 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598792 kB' 'Mapped: 143080 kB' 'Shmem: 9549036 kB' 'KReclaimable: 576316 kB' 'Slab: 1576940 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000624 kB' 'KernelStack: 22032 kB' 'PageTables: 8984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12071812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.083 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:37.084 nr_hugepages=1024 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:37.084 resv_hugepages=0 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:37.084 surplus_hugepages=0 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:37.084 anon_hugepages=0 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41651284 kB' 'MemAvailable: 43275904 kB' 'Buffers: 4076 kB' 'Cached: 11290080 kB' 'SwapCached: 76 kB' 'Active: 8701124 kB' 'Inactive: 3189020 kB' 'Active(anon): 7793968 kB' 'Inactive(anon): 2351072 kB' 'Active(file): 907156 kB' 'Inactive(file): 837948 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599136 kB' 'Mapped: 143080 kB' 'Shmem: 9549052 kB' 'KReclaimable: 576316 kB' 'Slab: 1576940 kB' 'SReclaimable: 576316 kB' 'SUnreclaim: 1000624 kB' 'KernelStack: 21936 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12071832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 113344 kB' 'HardwareCorrupted: 0 
kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 
12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.085 
12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:37.085 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 21757440 kB' 'MemUsed: 10876996 kB' 'SwapCached: 44 kB' 'Active: 5915604 kB' 'Inactive: 532564 kB' 'Active(anon): 5138176 kB' 'Inactive(anon): 56 kB' 'Active(file): 777428 kB' 'Inactive(file): 532508 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6166832 kB' 'Mapped: 86616 kB' 'AnonPages: 284512 kB' 'Shmem: 4856852 kB' 'KernelStack: 11176 kB' 'PageTables: 5068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 402364 kB' 'Slab: 886328 kB' 'SReclaimable: 402364 kB' 'SUnreclaim: 483964 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:37.087 node0=1024 expecting 1024 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:37.087 00:05:37.087 real 0m6.830s 00:05:37.087 user 0m2.486s 00:05:37.087 sys 0m4.406s 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:37.087 12:01:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:37.087 ************************************ 00:05:37.087 END TEST no_shrink_alloc 00:05:37.087 ************************************ 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.087 12:01:05 setup.sh.hugepages -- 
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:37.087 12:01:05 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:37.087 00:05:37.087 real 0m22.717s 00:05:37.087 user 0m7.733s 00:05:37.087 sys 0m13.637s 00:05:37.087 12:01:05 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:37.087 12:01:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:37.087 ************************************ 00:05:37.087 END TEST hugepages 00:05:37.087 ************************************ 00:05:37.087 12:01:05 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.087 12:01:05 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:37.087 12:01:05 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:37.087 12:01:05 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:37.345 ************************************ 00:05:37.345 START TEST driver 00:05:37.345 ************************************ 00:05:37.345 12:01:05 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.345 * Looking for test storage... 00:05:37.345 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # lcov --version 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:37.345 12:01:06 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:37.345 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.345 --rc genhtml_branch_coverage=1 00:05:37.345 --rc genhtml_function_coverage=1 00:05:37.345 --rc genhtml_legend=1 00:05:37.345 --rc geninfo_all_blocks=1 00:05:37.345 --rc geninfo_unexecuted_blocks=1 00:05:37.345 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.345 ' 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:37.345 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.345 --rc genhtml_branch_coverage=1 00:05:37.345 --rc genhtml_function_coverage=1 00:05:37.345 --rc genhtml_legend=1 00:05:37.345 --rc geninfo_all_blocks=1 00:05:37.345 --rc geninfo_unexecuted_blocks=1 00:05:37.345 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.345 ' 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:37.345 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.345 --rc genhtml_branch_coverage=1 00:05:37.345 --rc genhtml_function_coverage=1 00:05:37.345 --rc genhtml_legend=1 00:05:37.345 --rc geninfo_all_blocks=1 00:05:37.345 --rc geninfo_unexecuted_blocks=1 00:05:37.345 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.345 ' 00:05:37.345 12:01:06 setup.sh.driver -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:37.345 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.345 --rc genhtml_branch_coverage=1 00:05:37.345 --rc genhtml_function_coverage=1 00:05:37.345 --rc genhtml_legend=1 00:05:37.345 --rc geninfo_all_blocks=1 00:05:37.345 --rc geninfo_unexecuted_blocks=1 00:05:37.345 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.345 ' 00:05:37.345 12:01:06 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:37.345 12:01:06 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:37.345 12:01:06 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:42.708 12:01:10 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:42.708 12:01:10 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:42.708 12:01:10 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:42.708 12:01:10 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:42.708 ************************************ 00:05:42.708 START TEST guess_driver 00:05:42.708 ************************************ 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:42.708 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:42.708 Looking for driver=vfio-pci 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:42.708 12:01:10 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.996 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:45.997 12:01:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:47.375 12:01:15 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:51.566 00:05:51.566 real 0m9.288s 00:05:51.566 user 0m2.383s 00:05:51.566 sys 0m4.663s 00:05:51.566 12:01:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:51.566 12:01:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:51.566 ************************************ 00:05:51.566 END TEST guess_driver 00:05:51.566 ************************************ 00:05:51.566 00:05:51.566 real 0m14.283s 00:05:51.566 user 0m3.859s 00:05:51.566 sys 0m7.433s 00:05:51.566 12:01:20 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:51.566 12:01:20 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:51.566 ************************************ 00:05:51.566 END TEST driver 00:05:51.566 ************************************ 00:05:51.566 12:01:20 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:51.566 12:01:20 setup.sh -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:51.566 12:01:20 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:51.566 12:01:20 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:51.566 ************************************ 00:05:51.566 START TEST devices 00:05:51.566 ************************************ 00:05:51.566 12:01:20 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:51.566 * Looking for test storage... 00:05:51.566 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:51.566 12:01:20 setup.sh.devices -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:51.566 12:01:20 setup.sh.devices -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:51.566 12:01:20 setup.sh.devices -- common/autotest_common.sh@1681 -- # lcov --version 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:51.825 12:01:20 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:51.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.825 --rc genhtml_branch_coverage=1 00:05:51.825 --rc genhtml_function_coverage=1 00:05:51.825 --rc genhtml_legend=1 00:05:51.825 --rc geninfo_all_blocks=1 00:05:51.825 --rc geninfo_unexecuted_blocks=1 00:05:51.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.825 ' 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:51.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.825 --rc genhtml_branch_coverage=1 00:05:51.825 --rc genhtml_function_coverage=1 00:05:51.825 --rc genhtml_legend=1 00:05:51.825 --rc geninfo_all_blocks=1 00:05:51.825 --rc geninfo_unexecuted_blocks=1 00:05:51.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.825 ' 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:51.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.825 --rc genhtml_branch_coverage=1 00:05:51.825 --rc genhtml_function_coverage=1 00:05:51.825 --rc genhtml_legend=1 00:05:51.825 --rc geninfo_all_blocks=1 00:05:51.825 --rc geninfo_unexecuted_blocks=1 00:05:51.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.825 ' 00:05:51.825 12:01:20 setup.sh.devices -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:51.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.825 --rc genhtml_branch_coverage=1 00:05:51.825 --rc genhtml_function_coverage=1 00:05:51.825 --rc genhtml_legend=1 00:05:51.825 --rc geninfo_all_blocks=1 00:05:51.825 --rc geninfo_unexecuted_blocks=1 00:05:51.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.825 ' 00:05:51.825 12:01:20 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:51.825 12:01:20 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:51.825 12:01:20 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:51.825 12:01:20 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1656 -- # local nvme bdf 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:56.014 12:01:24 setup.sh.devices -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:56.014 12:01:24 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:56.014 12:01:24 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:56.014 12:01:24 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:56.014 No valid GPT data, bailing 00:05:56.014 12:01:24 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:56.015 12:01:24 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:56.015 12:01:24 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:56.015 12:01:24 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:56.015 12:01:24 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:56.015 12:01:24 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:56.015 12:01:24 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:56.015 12:01:24 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:56.015 12:01:24 
setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:56.015 12:01:24 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:56.015 ************************************ 00:05:56.015 START TEST nvme_mount 00:05:56.015 ************************************ 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:56.015 12:01:24 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:56.580 Creating new GPT entries in memory. 00:05:56.580 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:56.580 other utilities. 00:05:56.580 12:01:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:56.580 12:01:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:56.580 12:01:25 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:56.580 12:01:25 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:56.580 12:01:25 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:57.513 Creating new GPT entries in memory. 00:05:57.513 The operation has completed successfully. 
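The sgdisk call traced just above follows from simple sector arithmetic: the test asks for a 1 GiB partition (size=1073741824 bytes), converts it to 512-byte sectors via (( size /= 512 )), and places it at the conventional 2048-sector offset, which yields the 2048:2099199 range seen in the trace. A minimal standalone sketch of that step, using the device exercised in this run; it is an illustration of the traced commands, not the test script itself:

  # Illustrative re-creation of the partitioning step traced above (assumes a disposable test disk).
  disk=/dev/nvme0n1
  size=1073741824                              # 1 GiB requested
  (( size /= 512 ))                            # bytes -> 512-byte sectors = 2097152
  part_start=2048
  (( part_end = part_start + size - 1 ))       # 2048 + 2097152 - 1 = 2099199
  sgdisk "$disk" --zap-all                     # wipe any existing GPT/MBR structures
  flock "$disk" sgdisk "$disk" --new=1:${part_start}:${part_end}   # create nvme0n1p1

The trace continues below with mkfs.ext4 -qF on the new partition and a mount under the test directory.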
00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1692110 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:57.513 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:57.771 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:57.772 12:01:26 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.300 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:00.558 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:00.558 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:00.816 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:00.816 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:00.816 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:00.816 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:00.816 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:00.816 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:00.816 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.816 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:00.816 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:01.074 12:01:29 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:04.353 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:04.354 12:01:32 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:07.638 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:07.638 00:06:07.638 real 0m12.006s 00:06:07.638 user 0m3.280s 00:06:07.638 sys 0m6.469s 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:07.638 12:01:36 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:07.638 ************************************ 00:06:07.638 END TEST nvme_mount 00:06:07.638 ************************************ 00:06:07.638 12:01:36 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:07.638 12:01:36 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:07.638 12:01:36 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:07.638 12:01:36 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:07.638 ************************************ 00:06:07.638 START TEST dm_mount 00:06:07.638 ************************************ 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:07.638 12:01:36 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:08.575 Creating new GPT entries in memory. 00:06:08.575 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:08.575 other utilities. 00:06:08.575 12:01:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:08.575 12:01:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:08.575 12:01:37 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:08.575 12:01:37 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:08.575 12:01:37 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:09.511 Creating new GPT entries in memory. 00:06:09.511 The operation has completed successfully. 00:06:09.511 12:01:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:09.511 12:01:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:09.511 12:01:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:09.511 12:01:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:09.511 12:01:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:10.897 The operation has completed successfully. 
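For orientation: the nvme_mount test that just finished and the dm_mount test starting here exercise the same mkfs / mount / verify / teardown cycle against a scratch directory under spdk/test/setup. Reduced to its essentials for the nvme_mount case (paths and flags from the trace; the verify helper's PCI allow-list checks are left out):

    MNT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
    mkdir -p "$MNT"
    mkfs.ext4 -qF /dev/nvme0n1p1       # quiet + force, so stale signatures do not prompt
    mount /dev/nvme0n1p1 "$MNT"
    : > "$MNT/test_nvme"               # dummy file the verify step looks for
    # teardown: unmount, then erase the filesystem and GPT signatures
    umount "$MNT"
    wipefs --all /dev/nvme0n1p1
    wipefs --all /dev/nvme0n1

dm_mount repeats the cycle with two partitions joined into a dm target, which is what the sgdisk calls above are preparing.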
00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1696419 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:10.897 12:01:39 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:14.183 12:01:42 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:14.183 12:01:42 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.469 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:17.470 12:01:45 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:17.470 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:17.470 00:06:17.470 real 0m9.690s 00:06:17.470 user 0m2.278s 00:06:17.470 sys 0m4.460s 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:17.470 12:01:46 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:17.470 ************************************ 00:06:17.470 END TEST dm_mount 00:06:17.470 ************************************ 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:17.470 12:01:46 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:17.729 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:17.729 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:17.729 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:17.729 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:17.729 12:01:46 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:17.729 12:01:46 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:17.729 00:06:17.729 real 0m26.038s 00:06:17.729 user 0m7.136s 00:06:17.729 sys 0m13.621s 00:06:17.729 12:01:46 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:17.729 12:01:46 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:17.729 ************************************ 00:06:17.729 END TEST devices 00:06:17.729 ************************************ 00:06:17.729 00:06:17.729 real 1m27.794s 00:06:17.729 user 0m26.564s 00:06:17.729 sys 0m49.623s 00:06:17.729 12:01:46 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:17.729 12:01:46 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:17.729 ************************************ 00:06:17.729 END TEST setup.sh 00:06:17.729 ************************************ 00:06:17.729 12:01:46 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:20.261 Hugepages 00:06:20.261 node hugesize free / total 00:06:20.261 node0 1048576kB 0 / 0 00:06:20.261 node0 2048kB 1024 / 1024 00:06:20.261 node1 1048576kB 0 / 0 00:06:20.261 node1 2048kB 1024 / 1024 00:06:20.261 00:06:20.261 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:20.261 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:20.261 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:20.519 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:20.519 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:20.519 12:01:49 -- spdk/autotest.sh@117 -- # uname -s 00:06:20.519 12:01:49 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:20.519 12:01:49 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:20.519 12:01:49 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:23.802 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:23.802 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:23.802 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:25.705 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:25.705 12:01:54 -- common/autotest_common.sh@1515 -- # sleep 1 00:06:26.642 12:01:55 -- common/autotest_common.sh@1516 -- # bdfs=() 00:06:26.642 12:01:55 -- common/autotest_common.sh@1516 -- # local bdfs 00:06:26.642 12:01:55 -- common/autotest_common.sh@1518 -- # bdfs=($(get_nvme_bdfs)) 00:06:26.642 12:01:55 -- common/autotest_common.sh@1518 -- # get_nvme_bdfs 00:06:26.642 12:01:55 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:26.642 12:01:55 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:26.642 12:01:55 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:26.642 12:01:55 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:26.642 12:01:55 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:26.642 12:01:55 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:26.642 12:01:55 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:26.642 12:01:55 -- common/autotest_common.sh@1520 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:29.930 Waiting for block devices as requested 00:06:29.930 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:29.930 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:29.930 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:29.930 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:29.930 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:29.930 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:30.188 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:30.189 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:30.189 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:30.447 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:30.447 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:30.447 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:30.706 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:30.706 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:30.706 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:30.965 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:30.965 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:31.225 12:01:59 -- common/autotest_common.sh@1522 -- # for bdf in "${bdfs[@]}" 00:06:31.225 12:01:59 -- common/autotest_common.sh@1523 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1485 -- # readlink -f /sys/class/nvme/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1485 -- # grep 0000:d8:00.0/nvme/nvme 00:06:31.225 12:01:59 -- common/autotest_common.sh@1485 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1486 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:31.225 12:01:59 -- common/autotest_common.sh@1490 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1490 -- # printf '%s\n' nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1523 -- # nvme_ctrlr=/dev/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1524 -- # [[ -z /dev/nvme0 ]] 00:06:31.225 12:01:59 -- common/autotest_common.sh@1529 -- # cut -d: -f2 00:06:31.225 12:01:59 -- common/autotest_common.sh@1529 -- # nvme id-ctrl /dev/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1529 -- # grep oacs 00:06:31.225 12:01:59 -- common/autotest_common.sh@1529 -- # oacs=' 0xe' 00:06:31.225 12:01:59 -- common/autotest_common.sh@1530 -- # oacs_ns_manage=8 00:06:31.225 12:01:59 -- common/autotest_common.sh@1532 -- # [[ 8 -ne 0 ]] 00:06:31.225 12:01:59 -- common/autotest_common.sh@1538 -- # nvme id-ctrl /dev/nvme0 00:06:31.225 12:01:59 -- common/autotest_common.sh@1538 -- # grep unvmcap 00:06:31.225 12:01:59 -- common/autotest_common.sh@1538 -- # cut -d: -f2 00:06:31.225 12:01:59 -- common/autotest_common.sh@1538 -- # unvmcap=' 0' 00:06:31.225 12:01:59 -- common/autotest_common.sh@1539 -- # [[ 0 -eq 0 ]] 00:06:31.225 12:01:59 -- common/autotest_common.sh@1541 -- # continue 00:06:31.225 12:01:59 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:31.225 12:01:59 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:31.225 12:01:59 -- common/autotest_common.sh@10 -- # set +x 00:06:31.225 12:01:59 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:31.225 12:01:59 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:31.225 12:01:59 -- common/autotest_common.sh@10 -- # set +x 00:06:31.225 12:01:59 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:34.514 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:34.514 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:34.515 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:34.515 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:34.515 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:34.515 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:34.515 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:36.012 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:36.271 12:02:04 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:36.271 12:02:04 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:36.271 12:02:04 -- common/autotest_common.sh@10 -- # set +x 00:06:36.271 12:02:04 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:36.271 12:02:04 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:06:36.271 12:02:04 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:06:36.271 12:02:04 -- common/autotest_common.sh@1561 -- # bdfs=() 00:06:36.271 12:02:04 -- common/autotest_common.sh@1561 -- # _bdfs=() 00:06:36.271 12:02:04 -- common/autotest_common.sh@1561 -- # local bdfs _bdfs 00:06:36.271 12:02:04 -- common/autotest_common.sh@1562 -- # _bdfs=($(get_nvme_bdfs)) 00:06:36.271 12:02:04 -- 
common/autotest_common.sh@1562 -- # get_nvme_bdfs 00:06:36.271 12:02:04 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:36.271 12:02:04 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:36.271 12:02:04 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:36.271 12:02:04 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:36.271 12:02:04 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:36.271 12:02:05 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:36.271 12:02:05 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:36.271 12:02:05 -- common/autotest_common.sh@1563 -- # for bdf in "${_bdfs[@]}" 00:06:36.271 12:02:05 -- common/autotest_common.sh@1564 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:36.271 12:02:05 -- common/autotest_common.sh@1564 -- # device=0x0a54 00:06:36.271 12:02:05 -- common/autotest_common.sh@1565 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:36.271 12:02:05 -- common/autotest_common.sh@1566 -- # bdfs+=($bdf) 00:06:36.271 12:02:05 -- common/autotest_common.sh@1570 -- # (( 1 > 0 )) 00:06:36.271 12:02:05 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:06:36.271 12:02:05 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:06:36.271 12:02:05 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=1706051 00:06:36.271 12:02:05 -- common/autotest_common.sh@1583 -- # waitforlisten 1706051 00:06:36.271 12:02:05 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:36.271 12:02:05 -- common/autotest_common.sh@831 -- # '[' -z 1706051 ']' 00:06:36.271 12:02:05 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.271 12:02:05 -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:36.271 12:02:05 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:36.271 12:02:05 -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:36.271 12:02:05 -- common/autotest_common.sh@10 -- # set +x 00:06:36.271 [2024-11-27 12:02:05.132733] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
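The get_nvme_bdfs_by_id helper traced just above builds the list of NVMe PCI addresses from gen_nvme.sh and keeps only the controllers whose PCI device ID is 0x0a54. Condensed (script and helper names from the trace; the loop body is simplified):

    bdfs=($(./scripts/gen_nvme.sh | jq -r '.config[].params.traddr'))
    for bdf in "${bdfs[@]}"; do
        # PCI device ID; 0x0a54 matches the drive at 0000:d8:00.0 in this log
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
    done

The single matching address, 0000:d8:00.0, is what the freshly started spdk_tgt attaches via rpc.py bdev_nvme_attach_controller before the opal revert attempt that follows.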
00:06:36.271 [2024-11-27 12:02:05.132803] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706051 ] 00:06:36.530 [2024-11-27 12:02:05.201751] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.530 [2024-11-27 12:02:05.245142] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.789 12:02:05 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:36.789 12:02:05 -- common/autotest_common.sh@864 -- # return 0 00:06:36.789 12:02:05 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:06:36.789 12:02:05 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:06:36.789 12:02:05 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:40.078 nvme0n1 00:06:40.078 12:02:08 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:40.078 [2024-11-27 12:02:08.635569] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:40.078 request: 00:06:40.078 { 00:06:40.078 "nvme_ctrlr_name": "nvme0", 00:06:40.078 "password": "test", 00:06:40.078 "method": "bdev_nvme_opal_revert", 00:06:40.078 "req_id": 1 00:06:40.078 } 00:06:40.078 Got JSON-RPC error response 00:06:40.078 response: 00:06:40.078 { 00:06:40.078 "code": -32602, 00:06:40.078 "message": "Invalid parameters" 00:06:40.078 } 00:06:40.078 12:02:08 -- common/autotest_common.sh@1589 -- # true 00:06:40.078 12:02:08 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:06:40.078 12:02:08 -- common/autotest_common.sh@1593 -- # killprocess 1706051 00:06:40.078 12:02:08 -- common/autotest_common.sh@950 -- # '[' -z 1706051 ']' 00:06:40.078 12:02:08 -- common/autotest_common.sh@954 -- # kill -0 1706051 00:06:40.078 12:02:08 -- common/autotest_common.sh@955 -- # uname 00:06:40.078 12:02:08 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:40.078 12:02:08 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1706051 00:06:40.078 12:02:08 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:40.078 12:02:08 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:40.078 12:02:08 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1706051' 00:06:40.078 killing process with pid 1706051 00:06:40.078 12:02:08 -- common/autotest_common.sh@969 -- # kill 1706051 00:06:40.078 12:02:08 -- common/autotest_common.sh@974 -- # wait 1706051 00:06:41.983 12:02:10 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:41.983 12:02:10 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:41.983 12:02:10 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:41.983 12:02:10 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:41.983 12:02:10 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:41.983 12:02:10 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:41.983 12:02:10 -- common/autotest_common.sh@10 -- # set +x 00:06:41.983 12:02:10 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:41.983 12:02:10 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:41.983 12:02:10 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:41.983 12:02:10 -- common/autotest_common.sh@1107 
-- # xtrace_disable 00:06:41.983 12:02:10 -- common/autotest_common.sh@10 -- # set +x 00:06:42.243 ************************************ 00:06:42.243 START TEST env 00:06:42.243 ************************************ 00:06:42.243 12:02:10 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:42.243 * Looking for test storage... 00:06:42.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:42.243 12:02:10 env -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:42.243 12:02:10 env -- common/autotest_common.sh@1681 -- # lcov --version 00:06:42.243 12:02:10 env -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:42.243 12:02:11 env -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:42.243 12:02:11 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:42.243 12:02:11 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:42.243 12:02:11 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:42.243 12:02:11 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:42.243 12:02:11 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:42.243 12:02:11 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:42.243 12:02:11 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:42.243 12:02:11 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:42.243 12:02:11 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:42.243 12:02:11 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:42.243 12:02:11 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:42.243 12:02:11 env -- scripts/common.sh@344 -- # case "$op" in 00:06:42.243 12:02:11 env -- scripts/common.sh@345 -- # : 1 00:06:42.243 12:02:11 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:42.243 12:02:11 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:42.243 12:02:11 env -- scripts/common.sh@365 -- # decimal 1 00:06:42.243 12:02:11 env -- scripts/common.sh@353 -- # local d=1 00:06:42.243 12:02:11 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:42.243 12:02:11 env -- scripts/common.sh@355 -- # echo 1 00:06:42.243 12:02:11 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:42.243 12:02:11 env -- scripts/common.sh@366 -- # decimal 2 00:06:42.243 12:02:11 env -- scripts/common.sh@353 -- # local d=2 00:06:42.243 12:02:11 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:42.243 12:02:11 env -- scripts/common.sh@355 -- # echo 2 00:06:42.243 12:02:11 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:42.243 12:02:11 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:42.243 12:02:11 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:42.243 12:02:11 env -- scripts/common.sh@368 -- # return 0 00:06:42.243 12:02:11 env -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:42.243 12:02:11 env -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:42.243 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.243 --rc genhtml_branch_coverage=1 00:06:42.243 --rc genhtml_function_coverage=1 00:06:42.243 --rc genhtml_legend=1 00:06:42.243 --rc geninfo_all_blocks=1 00:06:42.243 --rc geninfo_unexecuted_blocks=1 00:06:42.243 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.243 ' 00:06:42.243 12:02:11 env -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:42.243 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.243 --rc genhtml_branch_coverage=1 00:06:42.243 --rc genhtml_function_coverage=1 00:06:42.243 --rc genhtml_legend=1 00:06:42.243 --rc geninfo_all_blocks=1 00:06:42.243 --rc geninfo_unexecuted_blocks=1 00:06:42.244 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.244 ' 00:06:42.244 12:02:11 env -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:42.244 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.244 --rc genhtml_branch_coverage=1 00:06:42.244 --rc genhtml_function_coverage=1 00:06:42.244 --rc genhtml_legend=1 00:06:42.244 --rc geninfo_all_blocks=1 00:06:42.244 --rc geninfo_unexecuted_blocks=1 00:06:42.244 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.244 ' 00:06:42.244 12:02:11 env -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:42.244 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.244 --rc genhtml_branch_coverage=1 00:06:42.244 --rc genhtml_function_coverage=1 00:06:42.244 --rc genhtml_legend=1 00:06:42.244 --rc geninfo_all_blocks=1 00:06:42.244 --rc geninfo_unexecuted_blocks=1 00:06:42.244 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.244 ' 00:06:42.244 12:02:11 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:42.244 12:02:11 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:42.244 12:02:11 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:42.244 12:02:11 env -- common/autotest_common.sh@10 -- # set +x 00:06:42.244 ************************************ 00:06:42.244 START TEST env_memory 00:06:42.244 ************************************ 00:06:42.244 12:02:11 env.env_memory -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:42.244 00:06:42.244 00:06:42.244 CUnit - A unit testing framework for C - Version 2.1-3 00:06:42.244 http://cunit.sourceforge.net/ 00:06:42.244 00:06:42.244 00:06:42.244 Suite: memory 00:06:42.505 Test: alloc and free memory map ...[2024-11-27 12:02:11.128259] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:42.505 passed 00:06:42.505 Test: mem map translation ...[2024-11-27 12:02:11.141259] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:42.505 [2024-11-27 12:02:11.141278] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:42.505 [2024-11-27 12:02:11.141310] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:42.505 [2024-11-27 12:02:11.141320] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:42.505 passed 00:06:42.505 Test: mem map registration ...[2024-11-27 12:02:11.161868] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:42.505 [2024-11-27 12:02:11.161885] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:42.505 passed 00:06:42.505 Test: mem map adjacent registrations ...passed 00:06:42.505 00:06:42.505 Run Summary: Type Total Ran Passed Failed Inactive 00:06:42.505 suites 1 1 n/a 0 0 00:06:42.505 tests 4 4 4 0 0 00:06:42.505 asserts 152 152 152 0 n/a 00:06:42.505 00:06:42.505 Elapsed time = 0.082 seconds 00:06:42.505 00:06:42.505 real 0m0.096s 00:06:42.505 user 0m0.084s 00:06:42.505 sys 0m0.012s 00:06:42.505 12:02:11 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:42.505 12:02:11 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:42.505 ************************************ 00:06:42.505 END TEST env_memory 00:06:42.505 ************************************ 00:06:42.505 12:02:11 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:42.505 12:02:11 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:42.505 12:02:11 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:42.505 12:02:11 env -- common/autotest_common.sh@10 -- # set +x 00:06:42.505 ************************************ 00:06:42.505 START TEST env_vtophys 00:06:42.505 ************************************ 00:06:42.505 12:02:11 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:42.505 EAL: lib.eal log level changed from notice to debug 00:06:42.505 EAL: Detected lcore 0 as core 0 on socket 0 00:06:42.505 EAL: Detected lcore 1 as core 1 on socket 0 00:06:42.505 EAL: Detected lcore 2 as core 2 on socket 0 00:06:42.505 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:42.505 EAL: Detected lcore 4 as core 4 on socket 0 00:06:42.505 EAL: Detected lcore 5 as core 5 on socket 0 00:06:42.505 EAL: Detected lcore 6 as core 6 on socket 0 00:06:42.505 EAL: Detected lcore 7 as core 8 on socket 0 00:06:42.505 EAL: Detected lcore 8 as core 9 on socket 0 00:06:42.505 EAL: Detected lcore 9 as core 10 on socket 0 00:06:42.505 EAL: Detected lcore 10 as core 11 on socket 0 00:06:42.505 EAL: Detected lcore 11 as core 12 on socket 0 00:06:42.505 EAL: Detected lcore 12 as core 13 on socket 0 00:06:42.505 EAL: Detected lcore 13 as core 14 on socket 0 00:06:42.505 EAL: Detected lcore 14 as core 16 on socket 0 00:06:42.505 EAL: Detected lcore 15 as core 17 on socket 0 00:06:42.505 EAL: Detected lcore 16 as core 18 on socket 0 00:06:42.505 EAL: Detected lcore 17 as core 19 on socket 0 00:06:42.505 EAL: Detected lcore 18 as core 20 on socket 0 00:06:42.505 EAL: Detected lcore 19 as core 21 on socket 0 00:06:42.505 EAL: Detected lcore 20 as core 22 on socket 0 00:06:42.505 EAL: Detected lcore 21 as core 24 on socket 0 00:06:42.505 EAL: Detected lcore 22 as core 25 on socket 0 00:06:42.505 EAL: Detected lcore 23 as core 26 on socket 0 00:06:42.505 EAL: Detected lcore 24 as core 27 on socket 0 00:06:42.505 EAL: Detected lcore 25 as core 28 on socket 0 00:06:42.505 EAL: Detected lcore 26 as core 29 on socket 0 00:06:42.505 EAL: Detected lcore 27 as core 30 on socket 0 00:06:42.505 EAL: Detected lcore 28 as core 0 on socket 1 00:06:42.505 EAL: Detected lcore 29 as core 1 on socket 1 00:06:42.505 EAL: Detected lcore 30 as core 2 on socket 1 00:06:42.505 EAL: Detected lcore 31 as core 3 on socket 1 00:06:42.505 EAL: Detected lcore 32 as core 4 on socket 1 00:06:42.505 EAL: Detected lcore 33 as core 5 on socket 1 00:06:42.505 EAL: Detected lcore 34 as core 6 on socket 1 00:06:42.505 EAL: Detected lcore 35 as core 8 on socket 1 00:06:42.505 EAL: Detected lcore 36 as core 9 on socket 1 00:06:42.506 EAL: Detected lcore 37 as core 10 on socket 1 00:06:42.506 EAL: Detected lcore 38 as core 11 on socket 1 00:06:42.506 EAL: Detected lcore 39 as core 12 on socket 1 00:06:42.506 EAL: Detected lcore 40 as core 13 on socket 1 00:06:42.506 EAL: Detected lcore 41 as core 14 on socket 1 00:06:42.506 EAL: Detected lcore 42 as core 16 on socket 1 00:06:42.506 EAL: Detected lcore 43 as core 17 on socket 1 00:06:42.506 EAL: Detected lcore 44 as core 18 on socket 1 00:06:42.506 EAL: Detected lcore 45 as core 19 on socket 1 00:06:42.506 EAL: Detected lcore 46 as core 20 on socket 1 00:06:42.506 EAL: Detected lcore 47 as core 21 on socket 1 00:06:42.506 EAL: Detected lcore 48 as core 22 on socket 1 00:06:42.506 EAL: Detected lcore 49 as core 24 on socket 1 00:06:42.506 EAL: Detected lcore 50 as core 25 on socket 1 00:06:42.506 EAL: Detected lcore 51 as core 26 on socket 1 00:06:42.506 EAL: Detected lcore 52 as core 27 on socket 1 00:06:42.506 EAL: Detected lcore 53 as core 28 on socket 1 00:06:42.506 EAL: Detected lcore 54 as core 29 on socket 1 00:06:42.506 EAL: Detected lcore 55 as core 30 on socket 1 00:06:42.506 EAL: Detected lcore 56 as core 0 on socket 0 00:06:42.506 EAL: Detected lcore 57 as core 1 on socket 0 00:06:42.506 EAL: Detected lcore 58 as core 2 on socket 0 00:06:42.506 EAL: Detected lcore 59 as core 3 on socket 0 00:06:42.506 EAL: Detected lcore 60 as core 4 on socket 0 00:06:42.506 EAL: Detected lcore 61 as core 5 on socket 0 00:06:42.506 EAL: Detected lcore 62 as core 6 on socket 0 00:06:42.506 EAL: Detected lcore 63 as core 8 on socket 0 00:06:42.506 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:42.506 EAL: Detected lcore 65 as core 10 on socket 0 00:06:42.506 EAL: Detected lcore 66 as core 11 on socket 0 00:06:42.506 EAL: Detected lcore 67 as core 12 on socket 0 00:06:42.506 EAL: Detected lcore 68 as core 13 on socket 0 00:06:42.506 EAL: Detected lcore 69 as core 14 on socket 0 00:06:42.506 EAL: Detected lcore 70 as core 16 on socket 0 00:06:42.506 EAL: Detected lcore 71 as core 17 on socket 0 00:06:42.506 EAL: Detected lcore 72 as core 18 on socket 0 00:06:42.506 EAL: Detected lcore 73 as core 19 on socket 0 00:06:42.506 EAL: Detected lcore 74 as core 20 on socket 0 00:06:42.506 EAL: Detected lcore 75 as core 21 on socket 0 00:06:42.506 EAL: Detected lcore 76 as core 22 on socket 0 00:06:42.506 EAL: Detected lcore 77 as core 24 on socket 0 00:06:42.506 EAL: Detected lcore 78 as core 25 on socket 0 00:06:42.506 EAL: Detected lcore 79 as core 26 on socket 0 00:06:42.506 EAL: Detected lcore 80 as core 27 on socket 0 00:06:42.506 EAL: Detected lcore 81 as core 28 on socket 0 00:06:42.506 EAL: Detected lcore 82 as core 29 on socket 0 00:06:42.506 EAL: Detected lcore 83 as core 30 on socket 0 00:06:42.506 EAL: Detected lcore 84 as core 0 on socket 1 00:06:42.506 EAL: Detected lcore 85 as core 1 on socket 1 00:06:42.506 EAL: Detected lcore 86 as core 2 on socket 1 00:06:42.506 EAL: Detected lcore 87 as core 3 on socket 1 00:06:42.506 EAL: Detected lcore 88 as core 4 on socket 1 00:06:42.506 EAL: Detected lcore 89 as core 5 on socket 1 00:06:42.506 EAL: Detected lcore 90 as core 6 on socket 1 00:06:42.506 EAL: Detected lcore 91 as core 8 on socket 1 00:06:42.506 EAL: Detected lcore 92 as core 9 on socket 1 00:06:42.506 EAL: Detected lcore 93 as core 10 on socket 1 00:06:42.506 EAL: Detected lcore 94 as core 11 on socket 1 00:06:42.506 EAL: Detected lcore 95 as core 12 on socket 1 00:06:42.506 EAL: Detected lcore 96 as core 13 on socket 1 00:06:42.506 EAL: Detected lcore 97 as core 14 on socket 1 00:06:42.506 EAL: Detected lcore 98 as core 16 on socket 1 00:06:42.506 EAL: Detected lcore 99 as core 17 on socket 1 00:06:42.506 EAL: Detected lcore 100 as core 18 on socket 1 00:06:42.506 EAL: Detected lcore 101 as core 19 on socket 1 00:06:42.506 EAL: Detected lcore 102 as core 20 on socket 1 00:06:42.506 EAL: Detected lcore 103 as core 21 on socket 1 00:06:42.506 EAL: Detected lcore 104 as core 22 on socket 1 00:06:42.506 EAL: Detected lcore 105 as core 24 on socket 1 00:06:42.506 EAL: Detected lcore 106 as core 25 on socket 1 00:06:42.506 EAL: Detected lcore 107 as core 26 on socket 1 00:06:42.506 EAL: Detected lcore 108 as core 27 on socket 1 00:06:42.506 EAL: Detected lcore 109 as core 28 on socket 1 00:06:42.506 EAL: Detected lcore 110 as core 29 on socket 1 00:06:42.506 EAL: Detected lcore 111 as core 30 on socket 1 00:06:42.506 EAL: Maximum logical cores by configuration: 128 00:06:42.506 EAL: Detected CPU lcores: 112 00:06:42.506 EAL: Detected NUMA nodes: 2 00:06:42.506 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:06:42.506 EAL: Checking presence of .so 'librte_eal.so.24' 00:06:42.506 EAL: Checking presence of .so 'librte_eal.so' 00:06:42.506 EAL: Detected static linkage of DPDK 00:06:42.506 EAL: No shared files mode enabled, IPC will be disabled 00:06:42.506 EAL: Bus pci wants IOVA as 'DC' 00:06:42.506 EAL: Buses did not request a specific IOVA mode. 00:06:42.506 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:42.506 EAL: Selected IOVA mode 'VA' 00:06:42.506 EAL: Probing VFIO support... 
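EAL selecting IOVA as VA and probing VFIO, as shown above, presumes the host has a populated IOMMU and the vfio-pci module loaded, which is also what the earlier ioatdma/nvme -> vfio-pci rebinds rely on. A quick, hedged pre-flight check one could run on a similar host; these are generic commands, not part of the autotest scripts:

    # Sanity-check the IOMMU/VFIO state that the EAL probe above depends on.
    ls /sys/kernel/iommu_groups | wc -l        # non-zero => IOMMU enabled and groups populated
    lsmod | grep -E '^vfio'                    # vfio, vfio_pci, vfio_iommu_type1 expected
    dmesg | grep -iE 'DMAR|IOMMU' | head       # kernel-side view of IOMMU initialization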
00:06:42.506 EAL: IOMMU type 1 (Type 1) is supported 00:06:42.506 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:42.506 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:42.506 EAL: VFIO support initialized 00:06:42.506 EAL: Ask a virtual area of 0x2e000 bytes 00:06:42.506 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:42.506 EAL: Setting up physically contiguous memory... 00:06:42.506 EAL: Setting maximum number of open files to 524288 00:06:42.506 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:42.506 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:42.506 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:42.506 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:42.506 EAL: Ask a virtual area of 0x61000 bytes 00:06:42.506 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:42.506 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:42.506 EAL: Ask a virtual area of 0x400000000 bytes 00:06:42.506 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:42.506 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:42.506 EAL: Hugepages will be freed exactly as allocated. 00:06:42.506 EAL: No shared files mode enabled, IPC is disabled 00:06:42.506 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: TSC frequency is ~2500000 KHz 00:06:42.507 EAL: Main lcore 0 is ready (tid=7f1e37a9aa00;cpuset=[0]) 00:06:42.507 EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 0 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 2MB 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Mem event callback 'spdk:(nil)' registered 00:06:42.507 00:06:42.507 00:06:42.507 CUnit - A unit testing framework for C - Version 2.1-3 00:06:42.507 http://cunit.sourceforge.net/ 00:06:42.507 00:06:42.507 00:06:42.507 Suite: components_suite 00:06:42.507 Test: vtophys_malloc_test ...passed 00:06:42.507 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 4 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 4MB 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was shrunk by 4MB 00:06:42.507 EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 4 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 6MB 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was shrunk by 6MB 00:06:42.507 EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 4 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 10MB 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was shrunk by 10MB 00:06:42.507 EAL: Trying to obtain current memory policy. 
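The 2048 kB memseg lists and the "Hugepages will be freed exactly as allocated" note above assume a 2 MB hugepage pool was reserved before the test, which the setup.sh run earlier in this log takes care of. A minimal way to confirm the reservation on a comparable box is sketched below; the HUGEMEM value is an arbitrary example and an assumption about the helper's environment variables, not what this job used:

    # Inspect the 2 MB hugepage pool the EAL memseg lists above are carved from.
    grep -i huge /proc/meminfo
    # Re-reserving hugepages (and rebinding devices) with the same helper script;
    # size chosen here purely for illustration:
    sudo HUGEMEM=4096 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh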
00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 4 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 18MB 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was shrunk by 18MB 00:06:42.507 EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.507 EAL: Restoring previous memory policy: 4 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was expanded by 34MB 00:06:42.507 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.507 EAL: request: mp_malloc_sync 00:06:42.507 EAL: No shared files mode enabled, IPC is disabled 00:06:42.507 EAL: Heap on socket 0 was shrunk by 34MB 00:06:42.507 EAL: Trying to obtain current memory policy. 00:06:42.507 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.767 EAL: Restoring previous memory policy: 4 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was expanded by 66MB 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was shrunk by 66MB 00:06:42.767 EAL: Trying to obtain current memory policy. 00:06:42.767 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.767 EAL: Restoring previous memory policy: 4 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was expanded by 130MB 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was shrunk by 130MB 00:06:42.767 EAL: Trying to obtain current memory policy. 00:06:42.767 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:42.767 EAL: Restoring previous memory policy: 4 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was expanded by 258MB 00:06:42.767 EAL: Calling mem event callback 'spdk:(nil)' 00:06:42.767 EAL: request: mp_malloc_sync 00:06:42.767 EAL: No shared files mode enabled, IPC is disabled 00:06:42.767 EAL: Heap on socket 0 was shrunk by 258MB 00:06:42.767 EAL: Trying to obtain current memory policy. 
00:06:42.767 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:43.027 EAL: Restoring previous memory policy: 4 00:06:43.027 EAL: Calling mem event callback 'spdk:(nil)' 00:06:43.027 EAL: request: mp_malloc_sync 00:06:43.027 EAL: No shared files mode enabled, IPC is disabled 00:06:43.027 EAL: Heap on socket 0 was expanded by 514MB 00:06:43.027 EAL: Calling mem event callback 'spdk:(nil)' 00:06:43.027 EAL: request: mp_malloc_sync 00:06:43.027 EAL: No shared files mode enabled, IPC is disabled 00:06:43.027 EAL: Heap on socket 0 was shrunk by 514MB 00:06:43.027 EAL: Trying to obtain current memory policy. 00:06:43.027 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:43.286 EAL: Restoring previous memory policy: 4 00:06:43.286 EAL: Calling mem event callback 'spdk:(nil)' 00:06:43.286 EAL: request: mp_malloc_sync 00:06:43.286 EAL: No shared files mode enabled, IPC is disabled 00:06:43.286 EAL: Heap on socket 0 was expanded by 1026MB 00:06:43.546 EAL: Calling mem event callback 'spdk:(nil)' 00:06:43.546 EAL: request: mp_malloc_sync 00:06:43.546 EAL: No shared files mode enabled, IPC is disabled 00:06:43.546 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:43.546 passed 00:06:43.546 00:06:43.546 Run Summary: Type Total Ran Passed Failed Inactive 00:06:43.546 suites 1 1 n/a 0 0 00:06:43.546 tests 2 2 2 0 0 00:06:43.546 asserts 497 497 497 0 n/a 00:06:43.546 00:06:43.546 Elapsed time = 0.970 seconds 00:06:43.546 EAL: Calling mem event callback 'spdk:(nil)' 00:06:43.546 EAL: request: mp_malloc_sync 00:06:43.546 EAL: No shared files mode enabled, IPC is disabled 00:06:43.546 EAL: Heap on socket 0 was shrunk by 2MB 00:06:43.546 EAL: No shared files mode enabled, IPC is disabled 00:06:43.546 EAL: No shared files mode enabled, IPC is disabled 00:06:43.546 EAL: No shared files mode enabled, IPC is disabled 00:06:43.546 00:06:43.546 real 0m1.095s 00:06:43.546 user 0m0.628s 00:06:43.546 sys 0m0.435s 00:06:43.546 12:02:12 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:43.546 12:02:12 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:43.546 ************************************ 00:06:43.546 END TEST env_vtophys 00:06:43.546 ************************************ 00:06:43.546 12:02:12 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:43.546 12:02:12 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:43.546 12:02:12 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:43.546 12:02:12 env -- common/autotest_common.sh@10 -- # set +x 00:06:43.806 ************************************ 00:06:43.806 START TEST env_pci 00:06:43.806 ************************************ 00:06:43.806 12:02:12 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:43.806 00:06:43.806 00:06:43.806 CUnit - A unit testing framework for C - Version 2.1-3 00:06:43.806 http://cunit.sourceforge.net/ 00:06:43.806 00:06:43.806 00:06:43.806 Suite: pci 00:06:43.806 Test: pci_hook ...[2024-11-27 12:02:12.462671] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1050:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1707351 has claimed it 00:06:43.806 EAL: Cannot find device (10000:00:01.0) 00:06:43.806 EAL: Failed to attach device on primary process 00:06:43.806 passed 00:06:43.806 00:06:43.806 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:43.806 suites 1 1 n/a 0 0 00:06:43.806 tests 1 1 1 0 0 00:06:43.806 asserts 25 25 25 0 n/a 00:06:43.806 00:06:43.806 Elapsed time = 0.035 seconds 00:06:43.806 00:06:43.806 real 0m0.055s 00:06:43.806 user 0m0.009s 00:06:43.806 sys 0m0.046s 00:06:43.806 12:02:12 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:43.806 12:02:12 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:43.806 ************************************ 00:06:43.806 END TEST env_pci 00:06:43.806 ************************************ 00:06:43.806 12:02:12 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:43.806 12:02:12 env -- env/env.sh@15 -- # uname 00:06:43.806 12:02:12 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:43.806 12:02:12 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:43.806 12:02:12 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:43.806 12:02:12 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:06:43.806 12:02:12 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:43.806 12:02:12 env -- common/autotest_common.sh@10 -- # set +x 00:06:43.806 ************************************ 00:06:43.806 START TEST env_dpdk_post_init 00:06:43.806 ************************************ 00:06:43.806 12:02:12 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:43.806 EAL: Detected CPU lcores: 112 00:06:43.806 EAL: Detected NUMA nodes: 2 00:06:43.806 EAL: Detected static linkage of DPDK 00:06:43.806 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:43.806 EAL: Selected IOVA mode 'VA' 00:06:43.806 EAL: VFIO support initialized 00:06:43.806 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:44.065 EAL: Using IOMMU type 1 (Type 1) 00:06:44.633 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:48.828 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:48.828 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:48.828 Starting DPDK initialization... 00:06:48.828 Starting SPDK post initialization... 00:06:48.828 SPDK NVMe probe 00:06:48.828 Attaching to 0000:d8:00.0 00:06:48.828 Attached to 0000:d8:00.0 00:06:48.828 Cleaning up... 
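The env_dpdk_post_init run above attaches the spdk_nvme driver to 0000:d8:00.0 and cleans up again; the harness drives it through run_test with the same core mask and base virtual address used throughout this job. Rerunning it by hand on the same workspace would look like the following sketch (sudo is assumed only because VFIO access typically requires it):

    # Manual rerun of the DPDK post-init test with the arguments shown above.
    sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init \
        -c 0x1 --base-virtaddr=0x200000000000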
00:06:48.828 00:06:48.828 real 0m4.641s 00:06:48.828 user 0m3.483s 00:06:48.828 sys 0m0.399s 00:06:48.828 12:02:17 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.828 12:02:17 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:48.828 ************************************ 00:06:48.828 END TEST env_dpdk_post_init 00:06:48.828 ************************************ 00:06:48.828 12:02:17 env -- env/env.sh@26 -- # uname 00:06:48.828 12:02:17 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:48.828 12:02:17 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:48.828 12:02:17 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:48.828 12:02:17 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.828 12:02:17 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.828 ************************************ 00:06:48.828 START TEST env_mem_callbacks 00:06:48.828 ************************************ 00:06:48.828 12:02:17 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:48.828 EAL: Detected CPU lcores: 112 00:06:48.828 EAL: Detected NUMA nodes: 2 00:06:48.828 EAL: Detected static linkage of DPDK 00:06:48.828 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:48.828 EAL: Selected IOVA mode 'VA' 00:06:48.828 EAL: VFIO support initialized 00:06:48.828 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:48.828 00:06:48.828 00:06:48.828 CUnit - A unit testing framework for C - Version 2.1-3 00:06:48.828 http://cunit.sourceforge.net/ 00:06:48.828 00:06:48.828 00:06:48.828 Suite: memory 00:06:48.828 Test: test ... 
00:06:48.828 register 0x200000200000 2097152 00:06:48.828 malloc 3145728 00:06:48.828 register 0x200000400000 4194304 00:06:48.828 buf 0x200000500000 len 3145728 PASSED 00:06:48.828 malloc 64 00:06:48.828 buf 0x2000004fff40 len 64 PASSED 00:06:48.828 malloc 4194304 00:06:48.828 register 0x200000800000 6291456 00:06:48.828 buf 0x200000a00000 len 4194304 PASSED 00:06:48.828 free 0x200000500000 3145728 00:06:48.828 free 0x2000004fff40 64 00:06:48.828 unregister 0x200000400000 4194304 PASSED 00:06:48.828 free 0x200000a00000 4194304 00:06:48.828 unregister 0x200000800000 6291456 PASSED 00:06:48.828 malloc 8388608 00:06:48.828 register 0x200000400000 10485760 00:06:48.828 buf 0x200000600000 len 8388608 PASSED 00:06:48.828 free 0x200000600000 8388608 00:06:48.828 unregister 0x200000400000 10485760 PASSED 00:06:48.828 passed 00:06:48.828 00:06:48.828 Run Summary: Type Total Ran Passed Failed Inactive 00:06:48.828 suites 1 1 n/a 0 0 00:06:48.828 tests 1 1 1 0 0 00:06:48.828 asserts 15 15 15 0 n/a 00:06:48.828 00:06:48.828 Elapsed time = 0.004 seconds 00:06:48.828 00:06:48.828 real 0m0.063s 00:06:48.828 user 0m0.021s 00:06:48.828 sys 0m0.041s 00:06:48.828 12:02:17 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.828 12:02:17 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:48.828 ************************************ 00:06:48.828 END TEST env_mem_callbacks 00:06:48.828 ************************************ 00:06:48.828 00:06:48.828 real 0m6.544s 00:06:48.828 user 0m4.482s 00:06:48.828 sys 0m1.317s 00:06:48.828 12:02:17 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.828 12:02:17 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.828 ************************************ 00:06:48.828 END TEST env 00:06:48.828 ************************************ 00:06:48.828 12:02:17 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:48.828 12:02:17 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:48.828 12:02:17 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.828 12:02:17 -- common/autotest_common.sh@10 -- # set +x 00:06:48.828 ************************************ 00:06:48.828 START TEST rpc 00:06:48.828 ************************************ 00:06:48.828 12:02:17 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:48.828 * Looking for test storage... 
00:06:48.828 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:48.828 12:02:17 rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:48.828 12:02:17 rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:48.828 12:02:17 rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:48.828 12:02:17 rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:48.829 12:02:17 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:48.829 12:02:17 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:48.829 12:02:17 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:48.829 12:02:17 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:48.829 12:02:17 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:48.829 12:02:17 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:48.829 12:02:17 rpc -- scripts/common.sh@345 -- # : 1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:48.829 12:02:17 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:48.829 12:02:17 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@353 -- # local d=1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:48.829 12:02:17 rpc -- scripts/common.sh@355 -- # echo 1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:48.829 12:02:17 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@353 -- # local d=2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:48.829 12:02:17 rpc -- scripts/common.sh@355 -- # echo 2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:48.829 12:02:17 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:48.829 12:02:17 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:48.829 12:02:17 rpc -- scripts/common.sh@368 -- # return 0 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:48.829 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.829 --rc genhtml_branch_coverage=1 00:06:48.829 --rc genhtml_function_coverage=1 00:06:48.829 --rc genhtml_legend=1 00:06:48.829 --rc geninfo_all_blocks=1 00:06:48.829 --rc geninfo_unexecuted_blocks=1 00:06:48.829 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.829 ' 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:48.829 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.829 --rc genhtml_branch_coverage=1 00:06:48.829 --rc genhtml_function_coverage=1 00:06:48.829 --rc genhtml_legend=1 00:06:48.829 --rc geninfo_all_blocks=1 00:06:48.829 --rc geninfo_unexecuted_blocks=1 00:06:48.829 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.829 ' 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@1695 -- # 
export 'LCOV=lcov 00:06:48.829 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.829 --rc genhtml_branch_coverage=1 00:06:48.829 --rc genhtml_function_coverage=1 00:06:48.829 --rc genhtml_legend=1 00:06:48.829 --rc geninfo_all_blocks=1 00:06:48.829 --rc geninfo_unexecuted_blocks=1 00:06:48.829 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.829 ' 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:48.829 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.829 --rc genhtml_branch_coverage=1 00:06:48.829 --rc genhtml_function_coverage=1 00:06:48.829 --rc genhtml_legend=1 00:06:48.829 --rc geninfo_all_blocks=1 00:06:48.829 --rc geninfo_unexecuted_blocks=1 00:06:48.829 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.829 ' 00:06:48.829 12:02:17 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1708512 00:06:48.829 12:02:17 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:48.829 12:02:17 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:48.829 12:02:17 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1708512 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@831 -- # '[' -z 1708512 ']' 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:48.829 12:02:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.829 [2024-11-27 12:02:17.696256] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:48.829 [2024-11-27 12:02:17.696336] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708512 ] 00:06:49.088 [2024-11-27 12:02:17.764552] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.089 [2024-11-27 12:02:17.804077] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:49.089 [2024-11-27 12:02:17.804120] app.c: 614:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1708512' to capture a snapshot of events at runtime. 00:06:49.089 [2024-11-27 12:02:17.804130] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:49.089 [2024-11-27 12:02:17.804138] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:49.089 [2024-11-27 12:02:17.804145] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1708512 for offline analysis/debug. 
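The NOTICE above gives both ways to consume the bdev tracepoints enabled with '-e bdev': attach spdk_trace to the live target, or copy the shared-memory file and decode it offline. A short sketch of each, using the pid from this run; the offline-file flag is an assumption that may vary between SPDK versions:

    # Live snapshot from the running spdk_tgt (pid taken from the log above).
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_trace -s spdk_tgt -p 1708512
    # Offline route: keep the shared-memory file and decode it later.
    cp /dev/shm/spdk_tgt_trace.pid1708512 /tmp/
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_trace -f /tmp/spdk_tgt_trace.pid1708512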
00:06:49.089 [2024-11-27 12:02:17.804165] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.349 12:02:17 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:49.349 12:02:17 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:49.349 12:02:17 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:49.349 12:02:17 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:49.349 12:02:17 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:49.349 12:02:17 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:49.349 12:02:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.349 12:02:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.349 12:02:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.349 ************************************ 00:06:49.349 START TEST rpc_integrity 00:06:49.349 ************************************ 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:49.349 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.349 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:49.350 { 00:06:49.350 "name": "Malloc0", 00:06:49.350 "aliases": [ 00:06:49.350 "9b563d38-2c6e-44bc-bc6d-b6e532aac54f" 00:06:49.350 ], 00:06:49.350 "product_name": "Malloc disk", 00:06:49.350 "block_size": 512, 00:06:49.350 "num_blocks": 16384, 00:06:49.350 "uuid": "9b563d38-2c6e-44bc-bc6d-b6e532aac54f", 00:06:49.350 "assigned_rate_limits": { 00:06:49.350 "rw_ios_per_sec": 0, 00:06:49.350 "rw_mbytes_per_sec": 0, 00:06:49.350 "r_mbytes_per_sec": 0, 00:06:49.350 "w_mbytes_per_sec": 
0 00:06:49.350 }, 00:06:49.350 "claimed": false, 00:06:49.350 "zoned": false, 00:06:49.350 "supported_io_types": { 00:06:49.350 "read": true, 00:06:49.350 "write": true, 00:06:49.350 "unmap": true, 00:06:49.350 "flush": true, 00:06:49.350 "reset": true, 00:06:49.350 "nvme_admin": false, 00:06:49.350 "nvme_io": false, 00:06:49.350 "nvme_io_md": false, 00:06:49.350 "write_zeroes": true, 00:06:49.350 "zcopy": true, 00:06:49.350 "get_zone_info": false, 00:06:49.350 "zone_management": false, 00:06:49.350 "zone_append": false, 00:06:49.350 "compare": false, 00:06:49.350 "compare_and_write": false, 00:06:49.350 "abort": true, 00:06:49.350 "seek_hole": false, 00:06:49.350 "seek_data": false, 00:06:49.350 "copy": true, 00:06:49.350 "nvme_iov_md": false 00:06:49.350 }, 00:06:49.350 "memory_domains": [ 00:06:49.350 { 00:06:49.350 "dma_device_id": "system", 00:06:49.350 "dma_device_type": 1 00:06:49.350 }, 00:06:49.350 { 00:06:49.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:49.350 "dma_device_type": 2 00:06:49.350 } 00:06:49.350 ], 00:06:49.350 "driver_specific": {} 00:06:49.350 } 00:06:49.350 ]' 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.350 [2024-11-27 12:02:18.136423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:49.350 [2024-11-27 12:02:18.136454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:49.350 [2024-11-27 12:02:18.136470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4d658c0 00:06:49.350 [2024-11-27 12:02:18.136479] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:49.350 [2024-11-27 12:02:18.137348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:49.350 [2024-11-27 12:02:18.137371] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:49.350 Passthru0 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:49.350 { 00:06:49.350 "name": "Malloc0", 00:06:49.350 "aliases": [ 00:06:49.350 "9b563d38-2c6e-44bc-bc6d-b6e532aac54f" 00:06:49.350 ], 00:06:49.350 "product_name": "Malloc disk", 00:06:49.350 "block_size": 512, 00:06:49.350 "num_blocks": 16384, 00:06:49.350 "uuid": "9b563d38-2c6e-44bc-bc6d-b6e532aac54f", 00:06:49.350 "assigned_rate_limits": { 00:06:49.350 "rw_ios_per_sec": 0, 00:06:49.350 "rw_mbytes_per_sec": 0, 00:06:49.350 "r_mbytes_per_sec": 0, 00:06:49.350 "w_mbytes_per_sec": 0 00:06:49.350 }, 00:06:49.350 "claimed": true, 00:06:49.350 "claim_type": "exclusive_write", 00:06:49.350 "zoned": false, 00:06:49.350 "supported_io_types": { 00:06:49.350 "read": true, 00:06:49.350 "write": true, 00:06:49.350 "unmap": true, 
00:06:49.350 "flush": true, 00:06:49.350 "reset": true, 00:06:49.350 "nvme_admin": false, 00:06:49.350 "nvme_io": false, 00:06:49.350 "nvme_io_md": false, 00:06:49.350 "write_zeroes": true, 00:06:49.350 "zcopy": true, 00:06:49.350 "get_zone_info": false, 00:06:49.350 "zone_management": false, 00:06:49.350 "zone_append": false, 00:06:49.350 "compare": false, 00:06:49.350 "compare_and_write": false, 00:06:49.350 "abort": true, 00:06:49.350 "seek_hole": false, 00:06:49.350 "seek_data": false, 00:06:49.350 "copy": true, 00:06:49.350 "nvme_iov_md": false 00:06:49.350 }, 00:06:49.350 "memory_domains": [ 00:06:49.350 { 00:06:49.350 "dma_device_id": "system", 00:06:49.350 "dma_device_type": 1 00:06:49.350 }, 00:06:49.350 { 00:06:49.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:49.350 "dma_device_type": 2 00:06:49.350 } 00:06:49.350 ], 00:06:49.350 "driver_specific": {} 00:06:49.350 }, 00:06:49.350 { 00:06:49.350 "name": "Passthru0", 00:06:49.350 "aliases": [ 00:06:49.350 "832af54e-8178-5302-9316-885ce7adfc08" 00:06:49.350 ], 00:06:49.350 "product_name": "passthru", 00:06:49.350 "block_size": 512, 00:06:49.350 "num_blocks": 16384, 00:06:49.350 "uuid": "832af54e-8178-5302-9316-885ce7adfc08", 00:06:49.350 "assigned_rate_limits": { 00:06:49.350 "rw_ios_per_sec": 0, 00:06:49.350 "rw_mbytes_per_sec": 0, 00:06:49.350 "r_mbytes_per_sec": 0, 00:06:49.350 "w_mbytes_per_sec": 0 00:06:49.350 }, 00:06:49.350 "claimed": false, 00:06:49.350 "zoned": false, 00:06:49.350 "supported_io_types": { 00:06:49.350 "read": true, 00:06:49.350 "write": true, 00:06:49.350 "unmap": true, 00:06:49.350 "flush": true, 00:06:49.350 "reset": true, 00:06:49.350 "nvme_admin": false, 00:06:49.350 "nvme_io": false, 00:06:49.350 "nvme_io_md": false, 00:06:49.350 "write_zeroes": true, 00:06:49.350 "zcopy": true, 00:06:49.350 "get_zone_info": false, 00:06:49.350 "zone_management": false, 00:06:49.350 "zone_append": false, 00:06:49.350 "compare": false, 00:06:49.350 "compare_and_write": false, 00:06:49.350 "abort": true, 00:06:49.350 "seek_hole": false, 00:06:49.350 "seek_data": false, 00:06:49.350 "copy": true, 00:06:49.350 "nvme_iov_md": false 00:06:49.350 }, 00:06:49.350 "memory_domains": [ 00:06:49.350 { 00:06:49.350 "dma_device_id": "system", 00:06:49.350 "dma_device_type": 1 00:06:49.350 }, 00:06:49.350 { 00:06:49.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:49.350 "dma_device_type": 2 00:06:49.350 } 00:06:49.350 ], 00:06:49.350 "driver_specific": { 00:06:49.350 "passthru": { 00:06:49.350 "name": "Passthru0", 00:06:49.350 "base_bdev_name": "Malloc0" 00:06:49.350 } 00:06:49.350 } 00:06:49.350 } 00:06:49.350 ]' 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.350 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:49.350 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:49.610 12:02:18 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:49.610 00:06:49.610 real 0m0.255s 00:06:49.610 user 0m0.144s 00:06:49.610 sys 0m0.050s 00:06:49.610 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.610 12:02:18 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:49.610 ************************************ 00:06:49.610 END TEST rpc_integrity 00:06:49.610 ************************************ 00:06:49.611 12:02:18 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:49.611 12:02:18 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.611 12:02:18 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.611 12:02:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 ************************************ 00:06:49.611 START TEST rpc_plugins 00:06:49.611 ************************************ 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:49.611 { 00:06:49.611 "name": "Malloc1", 00:06:49.611 "aliases": [ 00:06:49.611 "cd19c121-1d09-46ff-8063-22d839c03fbc" 00:06:49.611 ], 00:06:49.611 "product_name": "Malloc disk", 00:06:49.611 "block_size": 4096, 00:06:49.611 "num_blocks": 256, 00:06:49.611 "uuid": "cd19c121-1d09-46ff-8063-22d839c03fbc", 00:06:49.611 "assigned_rate_limits": { 00:06:49.611 "rw_ios_per_sec": 0, 00:06:49.611 "rw_mbytes_per_sec": 0, 00:06:49.611 "r_mbytes_per_sec": 0, 00:06:49.611 "w_mbytes_per_sec": 0 00:06:49.611 }, 00:06:49.611 "claimed": false, 00:06:49.611 "zoned": false, 00:06:49.611 "supported_io_types": { 00:06:49.611 "read": true, 00:06:49.611 "write": true, 00:06:49.611 "unmap": true, 00:06:49.611 "flush": true, 00:06:49.611 "reset": true, 00:06:49.611 "nvme_admin": false, 00:06:49.611 "nvme_io": false, 00:06:49.611 "nvme_io_md": false, 00:06:49.611 "write_zeroes": true, 00:06:49.611 "zcopy": true, 00:06:49.611 "get_zone_info": false, 00:06:49.611 "zone_management": false, 00:06:49.611 "zone_append": false, 00:06:49.611 "compare": false, 00:06:49.611 "compare_and_write": false, 00:06:49.611 "abort": true, 00:06:49.611 "seek_hole": false, 00:06:49.611 "seek_data": false, 00:06:49.611 "copy": true, 00:06:49.611 
"nvme_iov_md": false 00:06:49.611 }, 00:06:49.611 "memory_domains": [ 00:06:49.611 { 00:06:49.611 "dma_device_id": "system", 00:06:49.611 "dma_device_type": 1 00:06:49.611 }, 00:06:49.611 { 00:06:49.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:49.611 "dma_device_type": 2 00:06:49.611 } 00:06:49.611 ], 00:06:49.611 "driver_specific": {} 00:06:49.611 } 00:06:49.611 ]' 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:49.611 12:02:18 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:49.611 00:06:49.611 real 0m0.115s 00:06:49.611 user 0m0.061s 00:06:49.611 sys 0m0.018s 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.611 12:02:18 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:49.611 ************************************ 00:06:49.611 END TEST rpc_plugins 00:06:49.611 ************************************ 00:06:49.870 12:02:18 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:49.870 12:02:18 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.870 12:02:18 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.870 12:02:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.870 ************************************ 00:06:49.870 START TEST rpc_trace_cmd_test 00:06:49.870 ************************************ 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:49.870 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1708512", 00:06:49.870 "tpoint_group_mask": "0x8", 00:06:49.870 "iscsi_conn": { 00:06:49.870 "mask": "0x2", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "scsi": { 00:06:49.870 "mask": "0x4", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "bdev": { 00:06:49.870 "mask": "0x8", 00:06:49.870 "tpoint_mask": "0xffffffffffffffff" 00:06:49.870 }, 00:06:49.870 "nvmf_rdma": { 00:06:49.870 "mask": "0x10", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "nvmf_tcp": { 00:06:49.870 "mask": "0x20", 
00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "ftl": { 00:06:49.870 "mask": "0x40", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "blobfs": { 00:06:49.870 "mask": "0x80", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "dsa": { 00:06:49.870 "mask": "0x200", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "thread": { 00:06:49.870 "mask": "0x400", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "nvme_pcie": { 00:06:49.870 "mask": "0x800", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "iaa": { 00:06:49.870 "mask": "0x1000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "nvme_tcp": { 00:06:49.870 "mask": "0x2000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "bdev_nvme": { 00:06:49.870 "mask": "0x4000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "sock": { 00:06:49.870 "mask": "0x8000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "blob": { 00:06:49.870 "mask": "0x10000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 }, 00:06:49.870 "bdev_raid": { 00:06:49.870 "mask": "0x20000", 00:06:49.870 "tpoint_mask": "0x0" 00:06:49.870 } 00:06:49.870 }' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 18 -gt 2 ']' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:49.870 12:02:18 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:49.870 00:06:49.870 real 0m0.172s 00:06:49.870 user 0m0.136s 00:06:49.871 sys 0m0.026s 00:06:49.871 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.871 12:02:18 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:49.871 ************************************ 00:06:49.871 END TEST rpc_trace_cmd_test 00:06:49.871 ************************************ 00:06:49.871 12:02:18 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:49.871 12:02:18 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:49.871 12:02:18 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:49.871 12:02:18 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.871 12:02:18 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.871 12:02:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.130 ************************************ 00:06:50.130 START TEST rpc_daemon_integrity 00:06:50.130 ************************************ 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.130 12:02:18 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.130 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:50.130 { 00:06:50.130 "name": "Malloc2", 00:06:50.130 "aliases": [ 00:06:50.130 "f55e6351-0c49-452a-9575-f046c6c1e9ca" 00:06:50.130 ], 00:06:50.131 "product_name": "Malloc disk", 00:06:50.131 "block_size": 512, 00:06:50.131 "num_blocks": 16384, 00:06:50.131 "uuid": "f55e6351-0c49-452a-9575-f046c6c1e9ca", 00:06:50.131 "assigned_rate_limits": { 00:06:50.131 "rw_ios_per_sec": 0, 00:06:50.131 "rw_mbytes_per_sec": 0, 00:06:50.131 "r_mbytes_per_sec": 0, 00:06:50.131 "w_mbytes_per_sec": 0 00:06:50.131 }, 00:06:50.131 "claimed": false, 00:06:50.131 "zoned": false, 00:06:50.131 "supported_io_types": { 00:06:50.131 "read": true, 00:06:50.131 "write": true, 00:06:50.131 "unmap": true, 00:06:50.131 "flush": true, 00:06:50.131 "reset": true, 00:06:50.131 "nvme_admin": false, 00:06:50.131 "nvme_io": false, 00:06:50.131 "nvme_io_md": false, 00:06:50.131 "write_zeroes": true, 00:06:50.131 "zcopy": true, 00:06:50.131 "get_zone_info": false, 00:06:50.131 "zone_management": false, 00:06:50.131 "zone_append": false, 00:06:50.131 "compare": false, 00:06:50.131 "compare_and_write": false, 00:06:50.131 "abort": true, 00:06:50.131 "seek_hole": false, 00:06:50.131 "seek_data": false, 00:06:50.131 "copy": true, 00:06:50.131 "nvme_iov_md": false 00:06:50.131 }, 00:06:50.131 "memory_domains": [ 00:06:50.131 { 00:06:50.131 "dma_device_id": "system", 00:06:50.131 "dma_device_type": 1 00:06:50.131 }, 00:06:50.131 { 00:06:50.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:50.131 "dma_device_type": 2 00:06:50.131 } 00:06:50.131 ], 00:06:50.131 "driver_specific": {} 00:06:50.131 } 00:06:50.131 ]' 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.131 [2024-11-27 12:02:18.922455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:50.131 [2024-11-27 12:02:18.922484] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:06:50.131 [2024-11-27 12:02:18.922502] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4e87060 00:06:50.131 [2024-11-27 12:02:18.922511] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:50.131 [2024-11-27 12:02:18.923229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:50.131 [2024-11-27 12:02:18.923250] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:50.131 Passthru0 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:50.131 { 00:06:50.131 "name": "Malloc2", 00:06:50.131 "aliases": [ 00:06:50.131 "f55e6351-0c49-452a-9575-f046c6c1e9ca" 00:06:50.131 ], 00:06:50.131 "product_name": "Malloc disk", 00:06:50.131 "block_size": 512, 00:06:50.131 "num_blocks": 16384, 00:06:50.131 "uuid": "f55e6351-0c49-452a-9575-f046c6c1e9ca", 00:06:50.131 "assigned_rate_limits": { 00:06:50.131 "rw_ios_per_sec": 0, 00:06:50.131 "rw_mbytes_per_sec": 0, 00:06:50.131 "r_mbytes_per_sec": 0, 00:06:50.131 "w_mbytes_per_sec": 0 00:06:50.131 }, 00:06:50.131 "claimed": true, 00:06:50.131 "claim_type": "exclusive_write", 00:06:50.131 "zoned": false, 00:06:50.131 "supported_io_types": { 00:06:50.131 "read": true, 00:06:50.131 "write": true, 00:06:50.131 "unmap": true, 00:06:50.131 "flush": true, 00:06:50.131 "reset": true, 00:06:50.131 "nvme_admin": false, 00:06:50.131 "nvme_io": false, 00:06:50.131 "nvme_io_md": false, 00:06:50.131 "write_zeroes": true, 00:06:50.131 "zcopy": true, 00:06:50.131 "get_zone_info": false, 00:06:50.131 "zone_management": false, 00:06:50.131 "zone_append": false, 00:06:50.131 "compare": false, 00:06:50.131 "compare_and_write": false, 00:06:50.131 "abort": true, 00:06:50.131 "seek_hole": false, 00:06:50.131 "seek_data": false, 00:06:50.131 "copy": true, 00:06:50.131 "nvme_iov_md": false 00:06:50.131 }, 00:06:50.131 "memory_domains": [ 00:06:50.131 { 00:06:50.131 "dma_device_id": "system", 00:06:50.131 "dma_device_type": 1 00:06:50.131 }, 00:06:50.131 { 00:06:50.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:50.131 "dma_device_type": 2 00:06:50.131 } 00:06:50.131 ], 00:06:50.131 "driver_specific": {} 00:06:50.131 }, 00:06:50.131 { 00:06:50.131 "name": "Passthru0", 00:06:50.131 "aliases": [ 00:06:50.131 "83f0adad-0c44-5a1f-b642-8b9a4a897fe0" 00:06:50.131 ], 00:06:50.131 "product_name": "passthru", 00:06:50.131 "block_size": 512, 00:06:50.131 "num_blocks": 16384, 00:06:50.131 "uuid": "83f0adad-0c44-5a1f-b642-8b9a4a897fe0", 00:06:50.131 "assigned_rate_limits": { 00:06:50.131 "rw_ios_per_sec": 0, 00:06:50.131 "rw_mbytes_per_sec": 0, 00:06:50.131 "r_mbytes_per_sec": 0, 00:06:50.131 "w_mbytes_per_sec": 0 00:06:50.131 }, 00:06:50.131 "claimed": false, 00:06:50.131 "zoned": false, 00:06:50.131 "supported_io_types": { 00:06:50.131 "read": true, 00:06:50.131 "write": true, 00:06:50.131 "unmap": true, 00:06:50.131 "flush": true, 00:06:50.131 "reset": true, 00:06:50.131 "nvme_admin": false, 00:06:50.131 "nvme_io": false, 00:06:50.131 "nvme_io_md": false, 
00:06:50.131 "write_zeroes": true, 00:06:50.131 "zcopy": true, 00:06:50.131 "get_zone_info": false, 00:06:50.131 "zone_management": false, 00:06:50.131 "zone_append": false, 00:06:50.131 "compare": false, 00:06:50.131 "compare_and_write": false, 00:06:50.131 "abort": true, 00:06:50.131 "seek_hole": false, 00:06:50.131 "seek_data": false, 00:06:50.131 "copy": true, 00:06:50.131 "nvme_iov_md": false 00:06:50.131 }, 00:06:50.131 "memory_domains": [ 00:06:50.131 { 00:06:50.131 "dma_device_id": "system", 00:06:50.131 "dma_device_type": 1 00:06:50.131 }, 00:06:50.131 { 00:06:50.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:50.131 "dma_device_type": 2 00:06:50.131 } 00:06:50.131 ], 00:06:50.131 "driver_specific": { 00:06:50.131 "passthru": { 00:06:50.131 "name": "Passthru0", 00:06:50.131 "base_bdev_name": "Malloc2" 00:06:50.131 } 00:06:50.131 } 00:06:50.131 } 00:06:50.131 ]' 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.131 12:02:18 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.131 12:02:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.131 12:02:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:50.131 12:02:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:50.391 12:02:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:50.391 00:06:50.391 real 0m0.238s 00:06:50.391 user 0m0.148s 00:06:50.391 sys 0m0.030s 00:06:50.391 12:02:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.391 12:02:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:50.391 ************************************ 00:06:50.391 END TEST rpc_daemon_integrity 00:06:50.391 ************************************ 00:06:50.391 12:02:19 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:50.391 12:02:19 rpc -- rpc/rpc.sh@84 -- # killprocess 1708512 00:06:50.391 12:02:19 rpc -- common/autotest_common.sh@950 -- # '[' -z 1708512 ']' 00:06:50.391 12:02:19 rpc -- common/autotest_common.sh@954 -- # kill -0 1708512 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@955 -- # uname 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1708512 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:50.392 
12:02:19 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1708512' 00:06:50.392 killing process with pid 1708512 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@969 -- # kill 1708512 00:06:50.392 12:02:19 rpc -- common/autotest_common.sh@974 -- # wait 1708512 00:06:50.651 00:06:50.651 real 0m1.955s 00:06:50.651 user 0m2.373s 00:06:50.651 sys 0m0.757s 00:06:50.651 12:02:19 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.651 12:02:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.651 ************************************ 00:06:50.651 END TEST rpc 00:06:50.651 ************************************ 00:06:50.651 12:02:19 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:50.651 12:02:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.651 12:02:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.651 12:02:19 -- common/autotest_common.sh@10 -- # set +x 00:06:50.651 ************************************ 00:06:50.651 START TEST skip_rpc 00:06:50.651 ************************************ 00:06:50.651 12:02:19 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:50.910 * Looking for test storage... 00:06:50.910 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:50.910 12:02:19 skip_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.911 12:02:19 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:50.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.911 --rc genhtml_branch_coverage=1 00:06:50.911 --rc genhtml_function_coverage=1 00:06:50.911 --rc genhtml_legend=1 00:06:50.911 --rc geninfo_all_blocks=1 00:06:50.911 --rc geninfo_unexecuted_blocks=1 00:06:50.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.911 ' 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:50.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.911 --rc genhtml_branch_coverage=1 00:06:50.911 --rc genhtml_function_coverage=1 00:06:50.911 --rc genhtml_legend=1 00:06:50.911 --rc geninfo_all_blocks=1 00:06:50.911 --rc geninfo_unexecuted_blocks=1 00:06:50.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.911 ' 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:50.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.911 --rc genhtml_branch_coverage=1 00:06:50.911 --rc genhtml_function_coverage=1 00:06:50.911 --rc genhtml_legend=1 00:06:50.911 --rc geninfo_all_blocks=1 00:06:50.911 --rc geninfo_unexecuted_blocks=1 00:06:50.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.911 ' 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:50.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.911 --rc genhtml_branch_coverage=1 00:06:50.911 --rc genhtml_function_coverage=1 00:06:50.911 --rc genhtml_legend=1 00:06:50.911 --rc geninfo_all_blocks=1 00:06:50.911 --rc geninfo_unexecuted_blocks=1 00:06:50.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.911 ' 00:06:50.911 12:02:19 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:50.911 12:02:19 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:50.911 12:02:19 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.911 12:02:19 
skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.911 12:02:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.911 ************************************ 00:06:50.911 START TEST skip_rpc 00:06:50.911 ************************************ 00:06:50.911 12:02:19 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:50.911 12:02:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1708978 00:06:50.911 12:02:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:50.911 12:02:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:50.911 12:02:19 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:50.911 [2024-11-27 12:02:19.739779] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:50.911 [2024-11-27 12:02:19.739837] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708978 ] 00:06:51.170 [2024-11-27 12:02:19.805332] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.170 [2024-11-27 12:02:19.843657] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1708978 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 1708978 ']' 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 1708978 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1708978 
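The skip_rpc case above only has to prove that no RPC server comes up when the target is started with --no-rpc-server. A stand-alone sketch of the same check, assuming the job's build tree layout and simplified pid handling (hypothetical shell session, not part of the captured output):
    build/bin/spdk_tgt --no-rpc-server -m 0x1 &    # no /var/tmp/spdk.sock should be created
    tgt_pid=$!
    sleep 5                                        # give the reactor time to start
    if scripts/rpc.py spdk_get_version; then       # must fail: nothing is listening on the socket
            echo "unexpected: RPC server answered" >&2
            kill -9 $tgt_pid
            exit 1
    fi
    kill -9 $tgt_pid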
00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1708978' 00:06:56.444 killing process with pid 1708978 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 1708978 00:06:56.444 12:02:24 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 1708978 00:06:56.444 00:06:56.444 real 0m5.392s 00:06:56.444 user 0m5.170s 00:06:56.444 sys 0m0.275s 00:06:56.444 12:02:25 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:56.444 12:02:25 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.444 ************************************ 00:06:56.444 END TEST skip_rpc 00:06:56.444 ************************************ 00:06:56.444 12:02:25 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:56.444 12:02:25 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:56.444 12:02:25 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:56.444 12:02:25 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.444 ************************************ 00:06:56.444 START TEST skip_rpc_with_json 00:06:56.444 ************************************ 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1709935 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1709935 00:06:56.444 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 1709935 ']' 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:56.445 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:56.445 [2024-11-27 12:02:25.209963] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:56.445 [2024-11-27 12:02:25.210019] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1709935 ] 00:06:56.445 [2024-11-27 12:02:25.276199] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.445 [2024-11-27 12:02:25.315798] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:56.704 [2024-11-27 12:02:25.518654] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:56.704 request: 00:06:56.704 { 00:06:56.704 "trtype": "tcp", 00:06:56.704 "method": "nvmf_get_transports", 00:06:56.704 "req_id": 1 00:06:56.704 } 00:06:56.704 Got JSON-RPC error response 00:06:56.704 response: 00:06:56.704 { 00:06:56.704 "code": -19, 00:06:56.704 "message": "No such device" 00:06:56.704 } 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:56.704 [2024-11-27 12:02:25.526743] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:56.704 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:56.963 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:56.963 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:56.963 { 00:06:56.963 "subsystems": [ 00:06:56.963 { 00:06:56.963 "subsystem": "scheduler", 00:06:56.963 "config": [ 00:06:56.963 { 00:06:56.963 "method": "framework_set_scheduler", 00:06:56.963 "params": { 00:06:56.963 "name": "static" 00:06:56.963 } 00:06:56.963 } 00:06:56.963 ] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "vmd", 00:06:56.963 "config": [] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "sock", 00:06:56.963 "config": [ 00:06:56.963 { 00:06:56.963 "method": "sock_set_default_impl", 00:06:56.963 "params": { 00:06:56.963 "impl_name": "posix" 00:06:56.963 } 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "method": "sock_impl_set_options", 00:06:56.963 "params": { 00:06:56.963 "impl_name": "ssl", 00:06:56.963 "recv_buf_size": 4096, 00:06:56.963 "send_buf_size": 4096, 00:06:56.963 "enable_recv_pipe": true, 00:06:56.963 "enable_quickack": false, 00:06:56.963 
"enable_placement_id": 0, 00:06:56.963 "enable_zerocopy_send_server": true, 00:06:56.963 "enable_zerocopy_send_client": false, 00:06:56.963 "zerocopy_threshold": 0, 00:06:56.963 "tls_version": 0, 00:06:56.963 "enable_ktls": false 00:06:56.963 } 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "method": "sock_impl_set_options", 00:06:56.963 "params": { 00:06:56.963 "impl_name": "posix", 00:06:56.963 "recv_buf_size": 2097152, 00:06:56.963 "send_buf_size": 2097152, 00:06:56.963 "enable_recv_pipe": true, 00:06:56.963 "enable_quickack": false, 00:06:56.963 "enable_placement_id": 0, 00:06:56.963 "enable_zerocopy_send_server": true, 00:06:56.963 "enable_zerocopy_send_client": false, 00:06:56.963 "zerocopy_threshold": 0, 00:06:56.963 "tls_version": 0, 00:06:56.963 "enable_ktls": false 00:06:56.963 } 00:06:56.963 } 00:06:56.963 ] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "iobuf", 00:06:56.963 "config": [ 00:06:56.963 { 00:06:56.963 "method": "iobuf_set_options", 00:06:56.963 "params": { 00:06:56.963 "small_pool_count": 8192, 00:06:56.963 "large_pool_count": 1024, 00:06:56.963 "small_bufsize": 8192, 00:06:56.963 "large_bufsize": 135168 00:06:56.963 } 00:06:56.963 } 00:06:56.963 ] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "keyring", 00:06:56.963 "config": [] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "vfio_user_target", 00:06:56.963 "config": null 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "fsdev", 00:06:56.963 "config": [ 00:06:56.963 { 00:06:56.963 "method": "fsdev_set_opts", 00:06:56.963 "params": { 00:06:56.963 "fsdev_io_pool_size": 65535, 00:06:56.963 "fsdev_io_cache_size": 256 00:06:56.963 } 00:06:56.963 } 00:06:56.963 ] 00:06:56.963 }, 00:06:56.963 { 00:06:56.963 "subsystem": "accel", 00:06:56.963 "config": [ 00:06:56.963 { 00:06:56.963 "method": "accel_set_options", 00:06:56.963 "params": { 00:06:56.963 "small_cache_size": 128, 00:06:56.963 "large_cache_size": 16, 00:06:56.963 "task_count": 2048, 00:06:56.963 "sequence_count": 2048, 00:06:56.963 "buf_count": 2048 00:06:56.963 } 00:06:56.964 } 00:06:56.964 ] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "bdev", 00:06:56.964 "config": [ 00:06:56.964 { 00:06:56.964 "method": "bdev_set_options", 00:06:56.964 "params": { 00:06:56.964 "bdev_io_pool_size": 65535, 00:06:56.964 "bdev_io_cache_size": 256, 00:06:56.964 "bdev_auto_examine": true, 00:06:56.964 "iobuf_small_cache_size": 128, 00:06:56.964 "iobuf_large_cache_size": 16 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "bdev_raid_set_options", 00:06:56.964 "params": { 00:06:56.964 "process_window_size_kb": 1024, 00:06:56.964 "process_max_bandwidth_mb_sec": 0 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "bdev_nvme_set_options", 00:06:56.964 "params": { 00:06:56.964 "action_on_timeout": "none", 00:06:56.964 "timeout_us": 0, 00:06:56.964 "timeout_admin_us": 0, 00:06:56.964 "keep_alive_timeout_ms": 10000, 00:06:56.964 "arbitration_burst": 0, 00:06:56.964 "low_priority_weight": 0, 00:06:56.964 "medium_priority_weight": 0, 00:06:56.964 "high_priority_weight": 0, 00:06:56.964 "nvme_adminq_poll_period_us": 10000, 00:06:56.964 "nvme_ioq_poll_period_us": 0, 00:06:56.964 "io_queue_requests": 0, 00:06:56.964 "delay_cmd_submit": true, 00:06:56.964 "transport_retry_count": 4, 00:06:56.964 "bdev_retry_count": 3, 00:06:56.964 "transport_ack_timeout": 0, 00:06:56.964 "ctrlr_loss_timeout_sec": 0, 00:06:56.964 "reconnect_delay_sec": 0, 00:06:56.964 "fast_io_fail_timeout_sec": 0, 00:06:56.964 
"disable_auto_failback": false, 00:06:56.964 "generate_uuids": false, 00:06:56.964 "transport_tos": 0, 00:06:56.964 "nvme_error_stat": false, 00:06:56.964 "rdma_srq_size": 0, 00:06:56.964 "io_path_stat": false, 00:06:56.964 "allow_accel_sequence": false, 00:06:56.964 "rdma_max_cq_size": 0, 00:06:56.964 "rdma_cm_event_timeout_ms": 0, 00:06:56.964 "dhchap_digests": [ 00:06:56.964 "sha256", 00:06:56.964 "sha384", 00:06:56.964 "sha512" 00:06:56.964 ], 00:06:56.964 "dhchap_dhgroups": [ 00:06:56.964 "null", 00:06:56.964 "ffdhe2048", 00:06:56.964 "ffdhe3072", 00:06:56.964 "ffdhe4096", 00:06:56.964 "ffdhe6144", 00:06:56.964 "ffdhe8192" 00:06:56.964 ] 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "bdev_nvme_set_hotplug", 00:06:56.964 "params": { 00:06:56.964 "period_us": 100000, 00:06:56.964 "enable": false 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "bdev_iscsi_set_options", 00:06:56.964 "params": { 00:06:56.964 "timeout_sec": 30 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "bdev_wait_for_examine" 00:06:56.964 } 00:06:56.964 ] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "nvmf", 00:06:56.964 "config": [ 00:06:56.964 { 00:06:56.964 "method": "nvmf_set_config", 00:06:56.964 "params": { 00:06:56.964 "discovery_filter": "match_any", 00:06:56.964 "admin_cmd_passthru": { 00:06:56.964 "identify_ctrlr": false 00:06:56.964 }, 00:06:56.964 "dhchap_digests": [ 00:06:56.964 "sha256", 00:06:56.964 "sha384", 00:06:56.964 "sha512" 00:06:56.964 ], 00:06:56.964 "dhchap_dhgroups": [ 00:06:56.964 "null", 00:06:56.964 "ffdhe2048", 00:06:56.964 "ffdhe3072", 00:06:56.964 "ffdhe4096", 00:06:56.964 "ffdhe6144", 00:06:56.964 "ffdhe8192" 00:06:56.964 ] 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "nvmf_set_max_subsystems", 00:06:56.964 "params": { 00:06:56.964 "max_subsystems": 1024 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "nvmf_set_crdt", 00:06:56.964 "params": { 00:06:56.964 "crdt1": 0, 00:06:56.964 "crdt2": 0, 00:06:56.964 "crdt3": 0 00:06:56.964 } 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "method": "nvmf_create_transport", 00:06:56.964 "params": { 00:06:56.964 "trtype": "TCP", 00:06:56.964 "max_queue_depth": 128, 00:06:56.964 "max_io_qpairs_per_ctrlr": 127, 00:06:56.964 "in_capsule_data_size": 4096, 00:06:56.964 "max_io_size": 131072, 00:06:56.964 "io_unit_size": 131072, 00:06:56.964 "max_aq_depth": 128, 00:06:56.964 "num_shared_buffers": 511, 00:06:56.964 "buf_cache_size": 4294967295, 00:06:56.964 "dif_insert_or_strip": false, 00:06:56.964 "zcopy": false, 00:06:56.964 "c2h_success": true, 00:06:56.964 "sock_priority": 0, 00:06:56.964 "abort_timeout_sec": 1, 00:06:56.964 "ack_timeout": 0, 00:06:56.964 "data_wr_pool_size": 0 00:06:56.964 } 00:06:56.964 } 00:06:56.964 ] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "nbd", 00:06:56.964 "config": [] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "ublk", 00:06:56.964 "config": [] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "vhost_blk", 00:06:56.964 "config": [] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "scsi", 00:06:56.964 "config": null 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "iscsi", 00:06:56.964 "config": [ 00:06:56.964 { 00:06:56.964 "method": "iscsi_set_options", 00:06:56.964 "params": { 00:06:56.964 "node_base": "iqn.2016-06.io.spdk", 00:06:56.964 "max_sessions": 128, 00:06:56.964 "max_connections_per_session": 2, 00:06:56.964 "max_queue_depth": 64, 00:06:56.964 
"default_time2wait": 2, 00:06:56.964 "default_time2retain": 20, 00:06:56.964 "first_burst_length": 8192, 00:06:56.964 "immediate_data": true, 00:06:56.964 "allow_duplicated_isid": false, 00:06:56.964 "error_recovery_level": 0, 00:06:56.964 "nop_timeout": 60, 00:06:56.964 "nop_in_interval": 30, 00:06:56.964 "disable_chap": false, 00:06:56.964 "require_chap": false, 00:06:56.964 "mutual_chap": false, 00:06:56.964 "chap_group": 0, 00:06:56.964 "max_large_datain_per_connection": 64, 00:06:56.964 "max_r2t_per_connection": 4, 00:06:56.964 "pdu_pool_size": 36864, 00:06:56.964 "immediate_data_pool_size": 16384, 00:06:56.964 "data_out_pool_size": 2048 00:06:56.964 } 00:06:56.964 } 00:06:56.964 ] 00:06:56.964 }, 00:06:56.964 { 00:06:56.964 "subsystem": "vhost_scsi", 00:06:56.964 "config": [] 00:06:56.964 } 00:06:56.964 ] 00:06:56.964 } 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1709935 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1709935 ']' 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1709935 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1709935 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1709935' 00:06:56.964 killing process with pid 1709935 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1709935 00:06:56.964 12:02:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1709935 00:06:57.223 12:02:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1710083 00:06:57.223 12:02:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:57.223 12:02:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1710083 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1710083 ']' 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1710083 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1710083 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- 
# echo 'killing process with pid 1710083' 00:07:02.506 killing process with pid 1710083 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1710083 00:07:02.506 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1710083 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:02.766 00:07:02.766 real 0m6.250s 00:07:02.766 user 0m5.910s 00:07:02.766 sys 0m0.620s 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:02.766 ************************************ 00:07:02.766 END TEST skip_rpc_with_json 00:07:02.766 ************************************ 00:07:02.766 12:02:31 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:02.766 ************************************ 00:07:02.766 START TEST skip_rpc_with_delay 00:07:02.766 ************************************ 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 
00:07:02.766 [2024-11-27 12:02:31.548098] app.c: 840:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:07:02.766 [2024-11-27 12:02:31.548202] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:02.766 00:07:02.766 real 0m0.045s 00:07:02.766 user 0m0.019s 00:07:02.766 sys 0m0.026s 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.766 12:02:31 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:02.766 ************************************ 00:07:02.766 END TEST skip_rpc_with_delay 00:07:02.766 ************************************ 00:07:02.766 12:02:31 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:02.766 12:02:31 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:02.766 12:02:31 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.766 12:02:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:02.766 ************************************ 00:07:02.766 START TEST exit_on_failed_rpc_init 00:07:02.766 ************************************ 00:07:02.766 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1711194 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1711194 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 1711194 ']' 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:03.026 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:03.026 [2024-11-27 12:02:31.677704] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
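The skip_rpc_with_json pass that finished above boils down to: configure a live target over RPC, dump its state with save_config, then restart with --no-rpc-server --json and confirm the state is rebuilt from the file alone. A compressed sketch under those assumptions (paths are illustrative, error handling omitted, not part of the captured output):
    build/bin/spdk_tgt -m 0x1 &                                    # first instance, RPC server enabled
    tgt_pid=$!; sleep 5
    scripts/rpc.py nvmf_create_transport -t tcp                    # state that must survive the restart
    scripts/rpc.py save_config > /tmp/config.json                  # full JSON config, as shown in the log above
    kill -9 $tgt_pid
    build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /tmp/config.json > /tmp/log.txt 2>&1 &
    tgt_pid=$!; sleep 5
    grep -q 'TCP Transport Init' /tmp/log.txt                      # transport recreated purely from the JSON
    kill -9 $tgt_pid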
00:07:03.026 [2024-11-27 12:02:31.677787] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711194 ] 00:07:03.026 [2024-11-27 12:02:31.746042] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.026 [2024-11-27 12:02:31.783765] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:03.285 12:02:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:03.285 [2024-11-27 12:02:31.999174] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:03.285 [2024-11-27 12:02:31.999242] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711204 ] 00:07:03.285 [2024-11-27 12:02:32.064422] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.285 [2024-11-27 12:02:32.102725] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.285 [2024-11-27 12:02:32.102814] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:03.285 [2024-11-27 12:02:32.102827] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:03.285 [2024-11-27 12:02:32.102835] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:03.285 12:02:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:03.286 12:02:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1711194 00:07:03.286 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 1711194 ']' 00:07:03.286 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 1711194 00:07:03.286 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1711194 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1711194' 00:07:03.546 killing process with pid 1711194 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 1711194 00:07:03.546 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 1711194 00:07:03.806 00:07:03.806 real 0m0.884s 00:07:03.806 user 0m0.883s 00:07:03.806 sys 0m0.432s 00:07:03.806 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:03.806 12:02:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:03.806 ************************************ 00:07:03.806 END TEST exit_on_failed_rpc_init 00:07:03.806 ************************************ 00:07:03.806 12:02:32 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:03.806 00:07:03.806 real 0m13.070s 00:07:03.806 user 0m12.191s 00:07:03.806 sys 0m1.673s 00:07:03.806 12:02:32 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:03.806 12:02:32 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.806 ************************************ 00:07:03.806 END TEST skip_rpc 00:07:03.806 ************************************ 00:07:03.806 12:02:32 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:03.806 12:02:32 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:03.806 12:02:32 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:03.806 12:02:32 
-- common/autotest_common.sh@10 -- # set +x 00:07:03.806 ************************************ 00:07:03.806 START TEST rpc_client 00:07:03.806 ************************************ 00:07:03.806 12:02:32 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:04.066 * Looking for test storage... 00:07:04.066 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1681 -- # lcov --version 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@345 -- # : 1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@353 -- # local d=1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@355 -- # echo 1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@353 -- # local d=2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@355 -- # echo 2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.066 12:02:32 rpc_client -- scripts/common.sh@368 -- # return 0 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:04.066 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.066 --rc genhtml_branch_coverage=1 00:07:04.066 --rc genhtml_function_coverage=1 00:07:04.066 --rc genhtml_legend=1 00:07:04.066 --rc geninfo_all_blocks=1 00:07:04.066 --rc geninfo_unexecuted_blocks=1 00:07:04.066 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.066 ' 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:04.066 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.066 --rc genhtml_branch_coverage=1 00:07:04.066 --rc genhtml_function_coverage=1 00:07:04.066 --rc genhtml_legend=1 00:07:04.066 --rc geninfo_all_blocks=1 00:07:04.066 --rc geninfo_unexecuted_blocks=1 00:07:04.066 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.066 ' 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:04.066 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.066 --rc genhtml_branch_coverage=1 00:07:04.066 --rc genhtml_function_coverage=1 00:07:04.066 --rc genhtml_legend=1 00:07:04.066 --rc geninfo_all_blocks=1 00:07:04.066 --rc geninfo_unexecuted_blocks=1 00:07:04.066 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.066 ' 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:04.066 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.066 --rc genhtml_branch_coverage=1 00:07:04.066 --rc genhtml_function_coverage=1 00:07:04.066 --rc genhtml_legend=1 00:07:04.066 --rc geninfo_all_blocks=1 00:07:04.066 --rc geninfo_unexecuted_blocks=1 00:07:04.066 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.066 ' 00:07:04.066 12:02:32 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:04.066 OK 00:07:04.066 12:02:32 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:04.066 00:07:04.066 real 0m0.220s 00:07:04.066 user 0m0.119s 00:07:04.066 sys 0m0.120s 00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 
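[editor's note] The rpc_client trace here is scripts/common.sh doing a dotted-version comparison (lt 1.15 2 via cmp_versions) to decide which lcov options to export. A small sketch of that kind of comparison, as an illustration only rather than the scripts/common.sh implementation:

    #!/usr/bin/env bash
    # Minimal dotted-version "less than" check in the spirit of the cmp_versions trace.
    version_lt() {
        local IFS=.
        local -a a=($1) b=($2)
        local i
        for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
            local x=${a[i]:-0} y=${b[i]:-0}   # missing components compare as 0
            (( x < y )) && return 0
            (( x > y )) && return 1
        done
        return 1   # equal versions are not "less than"
    }

    version_lt 1.15 2 && echo "1.15 < 2"   # matches the lt 1.15 2 check in the log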
00:07:04.066 12:02:32 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:04.066 ************************************ 00:07:04.066 END TEST rpc_client 00:07:04.066 ************************************ 00:07:04.066 12:02:32 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:04.066 12:02:32 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:04.067 12:02:32 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.067 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:07:04.326 ************************************ 00:07:04.326 START TEST json_config 00:07:04.326 ************************************ 00:07:04.326 12:02:32 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1681 -- # lcov --version 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.326 12:02:33 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.326 12:02:33 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.326 12:02:33 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.326 12:02:33 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.326 12:02:33 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.326 12:02:33 json_config -- scripts/common.sh@344 -- # case "$op" in 00:07:04.326 12:02:33 json_config -- scripts/common.sh@345 -- # : 1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.326 12:02:33 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:04.326 12:02:33 json_config -- scripts/common.sh@365 -- # decimal 1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@353 -- # local d=1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.326 12:02:33 json_config -- scripts/common.sh@355 -- # echo 1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.326 12:02:33 json_config -- scripts/common.sh@366 -- # decimal 2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@353 -- # local d=2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.326 12:02:33 json_config -- scripts/common.sh@355 -- # echo 2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.326 12:02:33 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.326 12:02:33 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.326 12:02:33 json_config -- scripts/common.sh@368 -- # return 0 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.326 12:02:33 json_config -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:04.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.327 --rc genhtml_branch_coverage=1 00:07:04.327 --rc genhtml_function_coverage=1 00:07:04.327 --rc genhtml_legend=1 00:07:04.327 --rc geninfo_all_blocks=1 00:07:04.327 --rc geninfo_unexecuted_blocks=1 00:07:04.327 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.327 ' 00:07:04.327 12:02:33 json_config -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:04.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.327 --rc genhtml_branch_coverage=1 00:07:04.327 --rc genhtml_function_coverage=1 00:07:04.327 --rc genhtml_legend=1 00:07:04.327 --rc geninfo_all_blocks=1 00:07:04.327 --rc geninfo_unexecuted_blocks=1 00:07:04.327 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.327 ' 00:07:04.327 12:02:33 json_config -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:04.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.327 --rc genhtml_branch_coverage=1 00:07:04.327 --rc genhtml_function_coverage=1 00:07:04.327 --rc genhtml_legend=1 00:07:04.327 --rc geninfo_all_blocks=1 00:07:04.327 --rc geninfo_unexecuted_blocks=1 00:07:04.327 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.327 ' 00:07:04.327 12:02:33 json_config -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:04.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.327 --rc genhtml_branch_coverage=1 00:07:04.327 --rc genhtml_function_coverage=1 00:07:04.327 --rc genhtml_legend=1 00:07:04.327 --rc geninfo_all_blocks=1 00:07:04.327 --rc geninfo_unexecuted_blocks=1 00:07:04.327 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.327 ' 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:04.327 12:02:33 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:07:04.327 12:02:33 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:04.327 12:02:33 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:04.327 12:02:33 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:04.327 12:02:33 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.327 12:02:33 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.327 12:02:33 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.327 12:02:33 json_config -- paths/export.sh@5 -- # export PATH 00:07:04.327 12:02:33 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@51 -- # : 0 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:04.327 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:04.327 12:02:33 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:04.327 WARNING: No tests are enabled so not running JSON configuration tests 00:07:04.327 12:02:33 json_config -- json_config/json_config.sh@28 -- # exit 0 00:07:04.327 00:07:04.327 real 0m0.163s 00:07:04.327 user 0m0.101s 00:07:04.327 sys 0m0.067s 00:07:04.327 12:02:33 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:04.327 12:02:33 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:04.327 ************************************ 00:07:04.327 END TEST json_config 00:07:04.327 ************************************ 00:07:04.327 12:02:33 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:04.327 12:02:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:04.327 12:02:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.327 12:02:33 -- common/autotest_common.sh@10 -- # set +x 00:07:04.327 ************************************ 00:07:04.327 START TEST json_config_extra_key 00:07:04.327 ************************************ 00:07:04.327 12:02:33 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1681 -- # awk '{print 
$NF}' 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1681 -- # lcov --version 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:04.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.587 --rc genhtml_branch_coverage=1 00:07:04.587 --rc genhtml_function_coverage=1 00:07:04.587 --rc genhtml_legend=1 00:07:04.587 --rc geninfo_all_blocks=1 00:07:04.587 --rc geninfo_unexecuted_blocks=1 00:07:04.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.587 ' 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:04.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.587 --rc genhtml_branch_coverage=1 00:07:04.587 
--rc genhtml_function_coverage=1 00:07:04.587 --rc genhtml_legend=1 00:07:04.587 --rc geninfo_all_blocks=1 00:07:04.587 --rc geninfo_unexecuted_blocks=1 00:07:04.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.587 ' 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:04.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.587 --rc genhtml_branch_coverage=1 00:07:04.587 --rc genhtml_function_coverage=1 00:07:04.587 --rc genhtml_legend=1 00:07:04.587 --rc geninfo_all_blocks=1 00:07:04.587 --rc geninfo_unexecuted_blocks=1 00:07:04.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.587 ' 00:07:04.587 12:02:33 json_config_extra_key -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:04.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.587 --rc genhtml_branch_coverage=1 00:07:04.587 --rc genhtml_function_coverage=1 00:07:04.587 --rc genhtml_legend=1 00:07:04.587 --rc geninfo_all_blocks=1 00:07:04.587 --rc geninfo_unexecuted_blocks=1 00:07:04.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.587 ' 00:07:04.587 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:04.587 12:02:33 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:04.587 12:02:33 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:04.587 12:02:33 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:04.588 12:02:33 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.588 12:02:33 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.588 12:02:33 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.588 12:02:33 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:04.588 12:02:33 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:04.588 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:04.588 12:02:33 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:04.588 INFO: launching applications... 00:07:04.588 12:02:33 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1711636 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:04.588 Waiting for target to run... 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:04.588 12:02:33 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1711636 /var/tmp/spdk_tgt.sock 00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 1711636 ']' 00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:04.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
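[editor's note] waitforlisten, as traced for the json_config_extra_key target, loops until rpc.py can reach the freshly started process on its Unix socket (/var/tmp/spdk_tgt.sock here). A rough sketch of that wait; spdk_get_version is a real RPC (it appears in the rpc_get_methods output further down), but the retry budget and helper name are illustrative, not the values hard-coded in autotest_common.sh.

    #!/usr/bin/env bash
    # Sketch of a waitforlisten-style helper: poll the target's RPC socket until it answers.
    waitforlisten_sketch() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk_tgt.sock} i
        for ((i = 0; i < 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1       # target died while starting up
            if ./scripts/rpc.py -s "$rpc_addr" spdk_get_version &>/dev/null; then
                return 0                                 # RPC server is up and listening
            fi
            sleep 0.1
        done
        return 1
    }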
00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:04.588 12:02:33 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:04.588 [2024-11-27 12:02:33.400163] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:04.588 [2024-11-27 12:02:33.400225] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711636 ] 00:07:04.847 [2024-11-27 12:02:33.672026] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.847 [2024-11-27 12:02:33.693358] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.416 12:02:34 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:05.417 12:02:34 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:05.417 00:07:05.417 12:02:34 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:05.417 INFO: shutting down applications... 00:07:05.417 12:02:34 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1711636 ]] 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1711636 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1711636 00:07:05.417 12:02:34 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1711636 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:05.986 12:02:34 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:05.986 SPDK target shutdown done 00:07:05.986 12:02:34 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:05.986 Success 00:07:05.986 00:07:05.986 real 0m1.549s 00:07:05.986 user 0m1.321s 00:07:05.986 sys 0m0.399s 00:07:05.986 12:02:34 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:05.986 12:02:34 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:05.986 ************************************ 00:07:05.986 END TEST json_config_extra_key 00:07:05.986 ************************************ 00:07:05.986 12:02:34 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
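[editor's note] The json_config_extra_key shutdown traced just above is a SIGINT followed by a bounded kill -0 poll (30 iterations, 0.5 s apart in this log) before printing "SPDK target shutdown done". A hedged sketch of that loop; function and variable names are illustrative rather than the json_config/common.sh originals.

    #!/usr/bin/env bash
    # Sketch of the SIGINT-then-poll shutdown seen in the json_config/common.sh trace.
    shutdown_target() {
        local pid=$1 i
        kill -SIGINT "$pid"
        for ((i = 0; i < 30; i++)); do                   # 30 x 0.5 s, as in the trace
            if ! kill -0 "$pid" 2>/dev/null; then
                echo "SPDK target shutdown done"
                return 0
            fi
            sleep 0.5
        done
        echo "target did not exit, forcing it" >&2
        kill -9 "$pid" 2>/dev/null
        return 1
    }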
00:07:05.986 12:02:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:05.986 12:02:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:05.986 12:02:34 -- common/autotest_common.sh@10 -- # set +x 00:07:05.986 ************************************ 00:07:05.986 START TEST alias_rpc 00:07:05.986 ************************************ 00:07:05.986 12:02:34 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:06.247 * Looking for test storage... 00:07:06.247 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:06.247 12:02:34 alias_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:06.247 12:02:34 alias_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:07:06.247 12:02:34 alias_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@345 -- # : 1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:06.247 12:02:35 alias_rpc -- scripts/common.sh@368 -- # return 0 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:06.247 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.247 --rc genhtml_branch_coverage=1 00:07:06.247 --rc genhtml_function_coverage=1 00:07:06.247 --rc genhtml_legend=1 00:07:06.247 --rc geninfo_all_blocks=1 00:07:06.247 --rc geninfo_unexecuted_blocks=1 00:07:06.247 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.247 ' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:06.247 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.247 --rc genhtml_branch_coverage=1 00:07:06.247 --rc genhtml_function_coverage=1 00:07:06.247 --rc genhtml_legend=1 00:07:06.247 --rc geninfo_all_blocks=1 00:07:06.247 --rc geninfo_unexecuted_blocks=1 00:07:06.247 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.247 ' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:06.247 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.247 --rc genhtml_branch_coverage=1 00:07:06.247 --rc genhtml_function_coverage=1 00:07:06.247 --rc genhtml_legend=1 00:07:06.247 --rc geninfo_all_blocks=1 00:07:06.247 --rc geninfo_unexecuted_blocks=1 00:07:06.247 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.247 ' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:06.247 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.247 --rc genhtml_branch_coverage=1 00:07:06.247 --rc genhtml_function_coverage=1 00:07:06.247 --rc genhtml_legend=1 00:07:06.247 --rc geninfo_all_blocks=1 00:07:06.247 --rc geninfo_unexecuted_blocks=1 00:07:06.247 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.247 ' 00:07:06.247 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:06.247 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1711962 00:07:06.247 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1711962 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 1711962 ']' 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:06.247 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:06.247 12:02:35 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.247 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.247 [2024-11-27 12:02:35.051989] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:06.247 [2024-11-27 12:02:35.052051] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711962 ] 00:07:06.247 [2024-11-27 12:02:35.118291] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.506 [2024-11-27 12:02:35.159013] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.506 12:02:35 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:06.506 12:02:35 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:06.506 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:06.766 12:02:35 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1711962 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 1711962 ']' 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 1711962 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1711962 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1711962' 00:07:06.766 killing process with pid 1711962 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@969 -- # kill 1711962 00:07:06.766 12:02:35 alias_rpc -- common/autotest_common.sh@974 -- # wait 1711962 00:07:07.335 00:07:07.335 real 0m1.069s 00:07:07.335 user 0m1.038s 00:07:07.335 sys 0m0.440s 00:07:07.335 12:02:35 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:07.335 12:02:35 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:07.335 ************************************ 00:07:07.335 END TEST alias_rpc 00:07:07.335 ************************************ 00:07:07.335 12:02:35 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:07:07.335 12:02:35 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:07.335 12:02:35 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:07.335 12:02:35 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:07.335 12:02:35 -- common/autotest_common.sh@10 -- # set +x 00:07:07.335 ************************************ 00:07:07.335 START TEST 
spdkcli_tcp 00:07:07.335 ************************************ 00:07:07.335 12:02:35 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:07.335 * Looking for test storage... 00:07:07.335 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lcov --version 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:07.336 12:02:36 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:07.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.336 --rc genhtml_branch_coverage=1 00:07:07.336 --rc genhtml_function_coverage=1 00:07:07.336 --rc genhtml_legend=1 00:07:07.336 --rc geninfo_all_blocks=1 00:07:07.336 --rc geninfo_unexecuted_blocks=1 00:07:07.336 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.336 ' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:07.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.336 --rc genhtml_branch_coverage=1 00:07:07.336 --rc genhtml_function_coverage=1 00:07:07.336 --rc genhtml_legend=1 00:07:07.336 --rc geninfo_all_blocks=1 00:07:07.336 --rc geninfo_unexecuted_blocks=1 00:07:07.336 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.336 ' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:07.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.336 --rc genhtml_branch_coverage=1 00:07:07.336 --rc genhtml_function_coverage=1 00:07:07.336 --rc genhtml_legend=1 00:07:07.336 --rc geninfo_all_blocks=1 00:07:07.336 --rc geninfo_unexecuted_blocks=1 00:07:07.336 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.336 ' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:07.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.336 --rc genhtml_branch_coverage=1 00:07:07.336 --rc genhtml_function_coverage=1 00:07:07.336 --rc genhtml_legend=1 00:07:07.336 --rc geninfo_all_blocks=1 00:07:07.336 --rc geninfo_unexecuted_blocks=1 00:07:07.336 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.336 ' 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1712285 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:07.336 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1712285 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 1712285 ']' 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:07.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:07.336 12:02:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:07.595 [2024-11-27 12:02:36.225858] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:07.595 [2024-11-27 12:02:36.225923] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712285 ] 00:07:07.595 [2024-11-27 12:02:36.290590] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:07.595 [2024-11-27 12:02:36.333618] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.595 [2024-11-27 12:02:36.333620] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.855 12:02:36 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:07.855 12:02:36 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:07:07.855 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1712289 00:07:07.855 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:07.855 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:07.855 [ 00:07:07.855 "spdk_get_version", 00:07:07.855 "rpc_get_methods", 00:07:07.855 "notify_get_notifications", 00:07:07.855 "notify_get_types", 00:07:07.855 "trace_get_info", 00:07:07.855 "trace_get_tpoint_group_mask", 00:07:07.855 "trace_disable_tpoint_group", 00:07:07.855 "trace_enable_tpoint_group", 00:07:07.855 "trace_clear_tpoint_mask", 00:07:07.855 "trace_set_tpoint_mask", 00:07:07.855 "fsdev_set_opts", 00:07:07.855 "fsdev_get_opts", 00:07:07.855 "framework_get_pci_devices", 00:07:07.855 "framework_get_config", 00:07:07.855 "framework_get_subsystems", 00:07:07.855 "vfu_tgt_set_base_path", 00:07:07.855 
"keyring_get_keys", 00:07:07.855 "iobuf_get_stats", 00:07:07.855 "iobuf_set_options", 00:07:07.855 "sock_get_default_impl", 00:07:07.855 "sock_set_default_impl", 00:07:07.855 "sock_impl_set_options", 00:07:07.855 "sock_impl_get_options", 00:07:07.855 "vmd_rescan", 00:07:07.855 "vmd_remove_device", 00:07:07.855 "vmd_enable", 00:07:07.855 "accel_get_stats", 00:07:07.855 "accel_set_options", 00:07:07.855 "accel_set_driver", 00:07:07.855 "accel_crypto_key_destroy", 00:07:07.855 "accel_crypto_keys_get", 00:07:07.855 "accel_crypto_key_create", 00:07:07.855 "accel_assign_opc", 00:07:07.855 "accel_get_module_info", 00:07:07.855 "accel_get_opc_assignments", 00:07:07.855 "bdev_get_histogram", 00:07:07.855 "bdev_enable_histogram", 00:07:07.855 "bdev_set_qos_limit", 00:07:07.855 "bdev_set_qd_sampling_period", 00:07:07.855 "bdev_get_bdevs", 00:07:07.855 "bdev_reset_iostat", 00:07:07.855 "bdev_get_iostat", 00:07:07.855 "bdev_examine", 00:07:07.855 "bdev_wait_for_examine", 00:07:07.855 "bdev_set_options", 00:07:07.855 "scsi_get_devices", 00:07:07.855 "thread_set_cpumask", 00:07:07.855 "scheduler_set_options", 00:07:07.855 "framework_get_governor", 00:07:07.855 "framework_get_scheduler", 00:07:07.855 "framework_set_scheduler", 00:07:07.855 "framework_get_reactors", 00:07:07.855 "thread_get_io_channels", 00:07:07.855 "thread_get_pollers", 00:07:07.855 "thread_get_stats", 00:07:07.855 "framework_monitor_context_switch", 00:07:07.855 "spdk_kill_instance", 00:07:07.855 "log_enable_timestamps", 00:07:07.855 "log_get_flags", 00:07:07.855 "log_clear_flag", 00:07:07.855 "log_set_flag", 00:07:07.855 "log_get_level", 00:07:07.855 "log_set_level", 00:07:07.855 "log_get_print_level", 00:07:07.855 "log_set_print_level", 00:07:07.855 "framework_enable_cpumask_locks", 00:07:07.855 "framework_disable_cpumask_locks", 00:07:07.855 "framework_wait_init", 00:07:07.855 "framework_start_init", 00:07:07.855 "virtio_blk_create_transport", 00:07:07.855 "virtio_blk_get_transports", 00:07:07.855 "vhost_controller_set_coalescing", 00:07:07.855 "vhost_get_controllers", 00:07:07.855 "vhost_delete_controller", 00:07:07.855 "vhost_create_blk_controller", 00:07:07.855 "vhost_scsi_controller_remove_target", 00:07:07.855 "vhost_scsi_controller_add_target", 00:07:07.855 "vhost_start_scsi_controller", 00:07:07.855 "vhost_create_scsi_controller", 00:07:07.855 "ublk_recover_disk", 00:07:07.855 "ublk_get_disks", 00:07:07.855 "ublk_stop_disk", 00:07:07.855 "ublk_start_disk", 00:07:07.855 "ublk_destroy_target", 00:07:07.855 "ublk_create_target", 00:07:07.855 "nbd_get_disks", 00:07:07.855 "nbd_stop_disk", 00:07:07.855 "nbd_start_disk", 00:07:07.855 "env_dpdk_get_mem_stats", 00:07:07.855 "nvmf_stop_mdns_prr", 00:07:07.855 "nvmf_publish_mdns_prr", 00:07:07.855 "nvmf_subsystem_get_listeners", 00:07:07.855 "nvmf_subsystem_get_qpairs", 00:07:07.855 "nvmf_subsystem_get_controllers", 00:07:07.855 "nvmf_get_stats", 00:07:07.855 "nvmf_get_transports", 00:07:07.855 "nvmf_create_transport", 00:07:07.855 "nvmf_get_targets", 00:07:07.855 "nvmf_delete_target", 00:07:07.855 "nvmf_create_target", 00:07:07.855 "nvmf_subsystem_allow_any_host", 00:07:07.855 "nvmf_subsystem_set_keys", 00:07:07.855 "nvmf_subsystem_remove_host", 00:07:07.855 "nvmf_subsystem_add_host", 00:07:07.855 "nvmf_ns_remove_host", 00:07:07.855 "nvmf_ns_add_host", 00:07:07.855 "nvmf_subsystem_remove_ns", 00:07:07.855 "nvmf_subsystem_set_ns_ana_group", 00:07:07.855 "nvmf_subsystem_add_ns", 00:07:07.855 "nvmf_subsystem_listener_set_ana_state", 00:07:07.855 "nvmf_discovery_get_referrals", 
00:07:07.855 "nvmf_discovery_remove_referral", 00:07:07.855 "nvmf_discovery_add_referral", 00:07:07.855 "nvmf_subsystem_remove_listener", 00:07:07.855 "nvmf_subsystem_add_listener", 00:07:07.855 "nvmf_delete_subsystem", 00:07:07.855 "nvmf_create_subsystem", 00:07:07.855 "nvmf_get_subsystems", 00:07:07.855 "nvmf_set_crdt", 00:07:07.855 "nvmf_set_config", 00:07:07.855 "nvmf_set_max_subsystems", 00:07:07.855 "iscsi_get_histogram", 00:07:07.855 "iscsi_enable_histogram", 00:07:07.855 "iscsi_set_options", 00:07:07.855 "iscsi_get_auth_groups", 00:07:07.855 "iscsi_auth_group_remove_secret", 00:07:07.855 "iscsi_auth_group_add_secret", 00:07:07.856 "iscsi_delete_auth_group", 00:07:07.856 "iscsi_create_auth_group", 00:07:07.856 "iscsi_set_discovery_auth", 00:07:07.856 "iscsi_get_options", 00:07:07.856 "iscsi_target_node_request_logout", 00:07:07.856 "iscsi_target_node_set_redirect", 00:07:07.856 "iscsi_target_node_set_auth", 00:07:07.856 "iscsi_target_node_add_lun", 00:07:07.856 "iscsi_get_stats", 00:07:07.856 "iscsi_get_connections", 00:07:07.856 "iscsi_portal_group_set_auth", 00:07:07.856 "iscsi_start_portal_group", 00:07:07.856 "iscsi_delete_portal_group", 00:07:07.856 "iscsi_create_portal_group", 00:07:07.856 "iscsi_get_portal_groups", 00:07:07.856 "iscsi_delete_target_node", 00:07:07.856 "iscsi_target_node_remove_pg_ig_maps", 00:07:07.856 "iscsi_target_node_add_pg_ig_maps", 00:07:07.856 "iscsi_create_target_node", 00:07:07.856 "iscsi_get_target_nodes", 00:07:07.856 "iscsi_delete_initiator_group", 00:07:07.856 "iscsi_initiator_group_remove_initiators", 00:07:07.856 "iscsi_initiator_group_add_initiators", 00:07:07.856 "iscsi_create_initiator_group", 00:07:07.856 "iscsi_get_initiator_groups", 00:07:07.856 "fsdev_aio_delete", 00:07:07.856 "fsdev_aio_create", 00:07:07.856 "keyring_linux_set_options", 00:07:07.856 "keyring_file_remove_key", 00:07:07.856 "keyring_file_add_key", 00:07:07.856 "vfu_virtio_create_fs_endpoint", 00:07:07.856 "vfu_virtio_create_scsi_endpoint", 00:07:07.856 "vfu_virtio_scsi_remove_target", 00:07:07.856 "vfu_virtio_scsi_add_target", 00:07:07.856 "vfu_virtio_create_blk_endpoint", 00:07:07.856 "vfu_virtio_delete_endpoint", 00:07:07.856 "iaa_scan_accel_module", 00:07:07.856 "dsa_scan_accel_module", 00:07:07.856 "ioat_scan_accel_module", 00:07:07.856 "accel_error_inject_error", 00:07:07.856 "bdev_iscsi_delete", 00:07:07.856 "bdev_iscsi_create", 00:07:07.856 "bdev_iscsi_set_options", 00:07:07.856 "bdev_virtio_attach_controller", 00:07:07.856 "bdev_virtio_scsi_get_devices", 00:07:07.856 "bdev_virtio_detach_controller", 00:07:07.856 "bdev_virtio_blk_set_hotplug", 00:07:07.856 "bdev_ftl_set_property", 00:07:07.856 "bdev_ftl_get_properties", 00:07:07.856 "bdev_ftl_get_stats", 00:07:07.856 "bdev_ftl_unmap", 00:07:07.856 "bdev_ftl_unload", 00:07:07.856 "bdev_ftl_delete", 00:07:07.856 "bdev_ftl_load", 00:07:07.856 "bdev_ftl_create", 00:07:07.856 "bdev_aio_delete", 00:07:07.856 "bdev_aio_rescan", 00:07:07.856 "bdev_aio_create", 00:07:07.856 "blobfs_create", 00:07:07.856 "blobfs_detect", 00:07:07.856 "blobfs_set_cache_size", 00:07:07.856 "bdev_zone_block_delete", 00:07:07.856 "bdev_zone_block_create", 00:07:07.856 "bdev_delay_delete", 00:07:07.856 "bdev_delay_create", 00:07:07.856 "bdev_delay_update_latency", 00:07:07.856 "bdev_split_delete", 00:07:07.856 "bdev_split_create", 00:07:07.856 "bdev_error_inject_error", 00:07:07.856 "bdev_error_delete", 00:07:07.856 "bdev_error_create", 00:07:07.856 "bdev_raid_set_options", 00:07:07.856 "bdev_raid_remove_base_bdev", 00:07:07.856 
"bdev_raid_add_base_bdev", 00:07:07.856 "bdev_raid_delete", 00:07:07.856 "bdev_raid_create", 00:07:07.856 "bdev_raid_get_bdevs", 00:07:07.856 "bdev_lvol_set_parent_bdev", 00:07:07.856 "bdev_lvol_set_parent", 00:07:07.856 "bdev_lvol_check_shallow_copy", 00:07:07.856 "bdev_lvol_start_shallow_copy", 00:07:07.856 "bdev_lvol_grow_lvstore", 00:07:07.856 "bdev_lvol_get_lvols", 00:07:07.856 "bdev_lvol_get_lvstores", 00:07:07.856 "bdev_lvol_delete", 00:07:07.856 "bdev_lvol_set_read_only", 00:07:07.856 "bdev_lvol_resize", 00:07:07.856 "bdev_lvol_decouple_parent", 00:07:07.856 "bdev_lvol_inflate", 00:07:07.856 "bdev_lvol_rename", 00:07:07.856 "bdev_lvol_clone_bdev", 00:07:07.856 "bdev_lvol_clone", 00:07:07.856 "bdev_lvol_snapshot", 00:07:07.856 "bdev_lvol_create", 00:07:07.856 "bdev_lvol_delete_lvstore", 00:07:07.856 "bdev_lvol_rename_lvstore", 00:07:07.856 "bdev_lvol_create_lvstore", 00:07:07.856 "bdev_passthru_delete", 00:07:07.856 "bdev_passthru_create", 00:07:07.856 "bdev_nvme_cuse_unregister", 00:07:07.856 "bdev_nvme_cuse_register", 00:07:07.856 "bdev_opal_new_user", 00:07:07.856 "bdev_opal_set_lock_state", 00:07:07.856 "bdev_opal_delete", 00:07:07.856 "bdev_opal_get_info", 00:07:07.856 "bdev_opal_create", 00:07:07.856 "bdev_nvme_opal_revert", 00:07:07.856 "bdev_nvme_opal_init", 00:07:07.856 "bdev_nvme_send_cmd", 00:07:07.856 "bdev_nvme_set_keys", 00:07:07.856 "bdev_nvme_get_path_iostat", 00:07:07.856 "bdev_nvme_get_mdns_discovery_info", 00:07:07.856 "bdev_nvme_stop_mdns_discovery", 00:07:07.856 "bdev_nvme_start_mdns_discovery", 00:07:07.856 "bdev_nvme_set_multipath_policy", 00:07:07.856 "bdev_nvme_set_preferred_path", 00:07:07.856 "bdev_nvme_get_io_paths", 00:07:07.856 "bdev_nvme_remove_error_injection", 00:07:07.856 "bdev_nvme_add_error_injection", 00:07:07.856 "bdev_nvme_get_discovery_info", 00:07:07.856 "bdev_nvme_stop_discovery", 00:07:07.856 "bdev_nvme_start_discovery", 00:07:07.856 "bdev_nvme_get_controller_health_info", 00:07:07.856 "bdev_nvme_disable_controller", 00:07:07.856 "bdev_nvme_enable_controller", 00:07:07.856 "bdev_nvme_reset_controller", 00:07:07.856 "bdev_nvme_get_transport_statistics", 00:07:07.856 "bdev_nvme_apply_firmware", 00:07:07.856 "bdev_nvme_detach_controller", 00:07:07.856 "bdev_nvme_get_controllers", 00:07:07.856 "bdev_nvme_attach_controller", 00:07:07.856 "bdev_nvme_set_hotplug", 00:07:07.856 "bdev_nvme_set_options", 00:07:07.856 "bdev_null_resize", 00:07:07.856 "bdev_null_delete", 00:07:07.856 "bdev_null_create", 00:07:07.856 "bdev_malloc_delete", 00:07:07.856 "bdev_malloc_create" 00:07:07.856 ] 00:07:07.856 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:07.856 12:02:36 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:07.856 12:02:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:08.116 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:08.116 12:02:36 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1712285 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 1712285 ']' 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 1712285 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1712285 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:08.116 
12:02:36 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1712285' 00:07:08.116 killing process with pid 1712285 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 1712285 00:07:08.116 12:02:36 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 1712285 00:07:08.375 00:07:08.375 real 0m1.133s 00:07:08.375 user 0m1.831s 00:07:08.375 sys 0m0.520s 00:07:08.375 12:02:37 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:08.375 12:02:37 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:08.375 ************************************ 00:07:08.375 END TEST spdkcli_tcp 00:07:08.375 ************************************ 00:07:08.375 12:02:37 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:08.375 12:02:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:08.375 12:02:37 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:08.375 12:02:37 -- common/autotest_common.sh@10 -- # set +x 00:07:08.375 ************************************ 00:07:08.375 START TEST dpdk_mem_utility 00:07:08.375 ************************************ 00:07:08.375 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:08.635 * Looking for test storage... 00:07:08.635 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lcov --version 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:08.635 12:02:37 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:08.635 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.635 --rc genhtml_branch_coverage=1 00:07:08.635 --rc genhtml_function_coverage=1 00:07:08.635 --rc genhtml_legend=1 00:07:08.635 --rc geninfo_all_blocks=1 00:07:08.635 --rc geninfo_unexecuted_blocks=1 00:07:08.635 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.635 ' 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:08.635 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.635 --rc genhtml_branch_coverage=1 00:07:08.635 --rc genhtml_function_coverage=1 00:07:08.635 --rc genhtml_legend=1 00:07:08.635 --rc geninfo_all_blocks=1 00:07:08.635 --rc geninfo_unexecuted_blocks=1 00:07:08.635 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.635 ' 00:07:08.635 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:08.635 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.635 --rc genhtml_branch_coverage=1 00:07:08.635 --rc genhtml_function_coverage=1 00:07:08.635 --rc genhtml_legend=1 00:07:08.635 --rc geninfo_all_blocks=1 00:07:08.635 --rc geninfo_unexecuted_blocks=1 00:07:08.635 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.635 ' 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:08.636 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.636 --rc genhtml_branch_coverage=1 00:07:08.636 --rc genhtml_function_coverage=1 00:07:08.636 --rc genhtml_legend=1 00:07:08.636 --rc geninfo_all_blocks=1 00:07:08.636 --rc geninfo_unexecuted_blocks=1 00:07:08.636 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.636 ' 00:07:08.636 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:08.636 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1712455 00:07:08.636 12:02:37 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1712455 00:07:08.636 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 1712455 ']' 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.636 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:08.636 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:08.636 [2024-11-27 12:02:37.410499] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:08.636 [2024-11-27 12:02:37.410576] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712455 ] 00:07:08.636 [2024-11-27 12:02:37.477762] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.636 [2024-11-27 12:02:37.516321] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.895 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:08.895 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:07:08.895 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:08.895 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:08.895 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:08.895 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:08.895 { 00:07:08.895 "filename": "/tmp/spdk_mem_dump.txt" 00:07:08.895 } 00:07:08.895 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:08.895 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:08.895 DPDK memory size 860.000000 MiB in 1 heap(s) 00:07:08.895 1 heaps totaling size 860.000000 MiB 00:07:08.895 size: 860.000000 MiB heap id: 0 00:07:08.895 end heaps---------- 00:07:08.895 9 mempools totaling size 642.649841 MiB 00:07:08.895 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:08.895 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:08.895 size: 92.545471 MiB name: bdev_io_1712455 00:07:08.896 size: 51.011292 MiB name: evtpool_1712455 00:07:08.896 size: 50.003479 MiB name: msgpool_1712455 00:07:08.896 size: 36.509338 MiB name: fsdev_io_1712455 00:07:08.896 size: 21.763794 MiB name: PDU_Pool 00:07:08.896 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:08.896 size: 0.026123 MiB name: Session_Pool 00:07:08.896 end mempools------- 00:07:08.896 6 memzones totaling size 4.142822 MiB 00:07:08.896 size: 1.000366 MiB name: RG_ring_0_1712455 00:07:08.896 size: 1.000366 MiB name: RG_ring_1_1712455 00:07:08.896 size: 1.000366 MiB name: 
RG_ring_4_1712455 00:07:08.896 size: 1.000366 MiB name: RG_ring_5_1712455 00:07:08.896 size: 0.125366 MiB name: RG_ring_2_1712455 00:07:08.896 size: 0.015991 MiB name: RG_ring_3_1712455 00:07:08.896 end memzones------- 00:07:08.896 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:09.155 heap id: 0 total size: 860.000000 MiB number of busy elements: 44 number of free elements: 16 00:07:09.155 list of free elements. size: 13.984680 MiB 00:07:09.155 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:09.155 element at address: 0x200000800000 with size: 1.996948 MiB 00:07:09.155 element at address: 0x20001bc00000 with size: 0.999878 MiB 00:07:09.155 element at address: 0x20001be00000 with size: 0.999878 MiB 00:07:09.155 element at address: 0x200034a00000 with size: 0.994446 MiB 00:07:09.155 element at address: 0x20000b200000 with size: 0.959839 MiB 00:07:09.155 element at address: 0x200015e00000 with size: 0.954285 MiB 00:07:09.155 element at address: 0x20001c000000 with size: 0.936584 MiB 00:07:09.155 element at address: 0x200000200000 with size: 0.841614 MiB 00:07:09.155 element at address: 0x20001d800000 with size: 0.582886 MiB 00:07:09.155 element at address: 0x200003e00000 with size: 0.495605 MiB 00:07:09.155 element at address: 0x200007000000 with size: 0.490723 MiB 00:07:09.155 element at address: 0x20001c200000 with size: 0.485657 MiB 00:07:09.155 element at address: 0x200013800000 with size: 0.481934 MiB 00:07:09.155 element at address: 0x20002ac00000 with size: 0.410034 MiB 00:07:09.155 element at address: 0x200003a00000 with size: 0.354858 MiB 00:07:09.155 list of standard malloc elements. size: 199.218628 MiB 00:07:09.155 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:09.155 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:09.155 element at address: 0x20001bcfff80 with size: 1.000122 MiB 00:07:09.155 element at address: 0x20001befff80 with size: 1.000122 MiB 00:07:09.155 element at address: 0x20001c0fff80 with size: 1.000122 MiB 00:07:09.155 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:09.155 element at address: 0x20001c0eff00 with size: 0.062622 MiB 00:07:09.155 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:09.155 element at address: 0x20001c0efdc0 with size: 0.000305 MiB 00:07:09.155 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003a5ad80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003adb300 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003adb500 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003affa80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003affb40 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003e7ee00 
with size: 0.000183 MiB 00:07:09.155 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x20000707da00 with size: 0.000183 MiB 00:07:09.155 element at address: 0x20000707dac0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:07:09.155 element at address: 0x20001387b600 with size: 0.000183 MiB 00:07:09.155 element at address: 0x20001387b6c0 with size: 0.000183 MiB 00:07:09.155 element at address: 0x2000138fb980 with size: 0.000183 MiB 00:07:09.155 element at address: 0x200015ef44c0 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20001c0efc40 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20001c0efd00 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20001c2bc740 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20001d895380 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20001d895440 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20002ac68f80 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20002ac69040 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20002ac6fc40 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20002ac6fe40 with size: 0.000183 MiB 00:07:09.156 element at address: 0x20002ac6ff00 with size: 0.000183 MiB 00:07:09.156 list of memzone associated elements. size: 646.796692 MiB 00:07:09.156 element at address: 0x20001d895500 with size: 211.416748 MiB 00:07:09.156 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:09.156 element at address: 0x20002ac6ffc0 with size: 157.562561 MiB 00:07:09.156 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:09.156 element at address: 0x200015ff4780 with size: 92.045044 MiB 00:07:09.156 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_1712455_0 00:07:09.156 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:09.156 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1712455_0 00:07:09.156 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:09.156 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1712455_0 00:07:09.156 element at address: 0x2000139fdb80 with size: 36.008911 MiB 00:07:09.156 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_1712455_0 00:07:09.156 element at address: 0x20001c3be940 with size: 20.255554 MiB 00:07:09.156 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:09.156 element at address: 0x200034bfeb40 with size: 18.005066 MiB 00:07:09.156 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:09.156 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:09.156 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1712455 00:07:09.156 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:09.156 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1712455 00:07:09.156 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:09.156 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1712455 00:07:09.156 element at address: 0x2000138fba40 with size: 1.008118 MiB 00:07:09.156 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:09.156 element at address: 0x20001c2bc800 with size: 1.008118 MiB 00:07:09.156 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:09.156 element at address: 
0x20000b2fde40 with size: 1.008118 MiB 00:07:09.156 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:09.156 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:09.156 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:09.156 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:09.156 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1712455 00:07:09.156 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:09.156 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1712455 00:07:09.156 element at address: 0x200015ef4580 with size: 1.000488 MiB 00:07:09.156 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1712455 00:07:09.156 element at address: 0x200034afe940 with size: 1.000488 MiB 00:07:09.156 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1712455 00:07:09.156 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:07:09.156 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_1712455 00:07:09.156 element at address: 0x200003e7eec0 with size: 0.500488 MiB 00:07:09.156 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1712455 00:07:09.156 element at address: 0x20001387b780 with size: 0.500488 MiB 00:07:09.156 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:09.156 element at address: 0x20000707db80 with size: 0.500488 MiB 00:07:09.156 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:09.156 element at address: 0x20001c27c540 with size: 0.250488 MiB 00:07:09.156 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:09.156 element at address: 0x200003adf880 with size: 0.125488 MiB 00:07:09.156 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1712455 00:07:09.156 element at address: 0x20000b2f5b80 with size: 0.031738 MiB 00:07:09.156 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:09.156 element at address: 0x20002ac69100 with size: 0.023743 MiB 00:07:09.156 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:09.156 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:07:09.156 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1712455 00:07:09.156 element at address: 0x20002ac6f240 with size: 0.002441 MiB 00:07:09.156 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:09.156 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:07:09.156 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1712455 00:07:09.156 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:07:09.156 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_1712455 00:07:09.156 element at address: 0x200003a5ae40 with size: 0.000305 MiB 00:07:09.156 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1712455 00:07:09.156 element at address: 0x20002ac6fd00 with size: 0.000305 MiB 00:07:09.156 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:09.156 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:09.156 12:02:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1712455 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 1712455 ']' 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 1712455 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@955 -- 
# uname 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1712455 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1712455' 00:07:09.156 killing process with pid 1712455 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 1712455 00:07:09.156 12:02:37 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 1712455 00:07:09.415 00:07:09.415 real 0m0.984s 00:07:09.415 user 0m0.891s 00:07:09.415 sys 0m0.429s 00:07:09.415 12:02:38 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:09.415 12:02:38 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:09.415 ************************************ 00:07:09.415 END TEST dpdk_mem_utility 00:07:09.415 ************************************ 00:07:09.415 12:02:38 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:09.415 12:02:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:09.415 12:02:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:09.415 12:02:38 -- common/autotest_common.sh@10 -- # set +x 00:07:09.415 ************************************ 00:07:09.415 START TEST event 00:07:09.415 ************************************ 00:07:09.415 12:02:38 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:09.675 * Looking for test storage... 00:07:09.675 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1681 -- # lcov --version 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:09.675 12:02:38 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:09.675 12:02:38 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:09.675 12:02:38 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:09.675 12:02:38 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:09.675 12:02:38 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:09.675 12:02:38 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:09.675 12:02:38 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:09.675 12:02:38 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:09.675 12:02:38 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:09.675 12:02:38 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:09.675 12:02:38 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:09.675 12:02:38 event -- scripts/common.sh@344 -- # case "$op" in 00:07:09.675 12:02:38 event -- scripts/common.sh@345 -- # : 1 00:07:09.675 12:02:38 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:09.675 12:02:38 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:09.675 12:02:38 event -- scripts/common.sh@365 -- # decimal 1 00:07:09.675 12:02:38 event -- scripts/common.sh@353 -- # local d=1 00:07:09.675 12:02:38 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:09.675 12:02:38 event -- scripts/common.sh@355 -- # echo 1 00:07:09.675 12:02:38 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:09.675 12:02:38 event -- scripts/common.sh@366 -- # decimal 2 00:07:09.675 12:02:38 event -- scripts/common.sh@353 -- # local d=2 00:07:09.675 12:02:38 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:09.675 12:02:38 event -- scripts/common.sh@355 -- # echo 2 00:07:09.675 12:02:38 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:09.675 12:02:38 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:09.675 12:02:38 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:09.675 12:02:38 event -- scripts/common.sh@368 -- # return 0 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:09.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.675 --rc genhtml_branch_coverage=1 00:07:09.675 --rc genhtml_function_coverage=1 00:07:09.675 --rc genhtml_legend=1 00:07:09.675 --rc geninfo_all_blocks=1 00:07:09.675 --rc geninfo_unexecuted_blocks=1 00:07:09.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.675 ' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:09.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.675 --rc genhtml_branch_coverage=1 00:07:09.675 --rc genhtml_function_coverage=1 00:07:09.675 --rc genhtml_legend=1 00:07:09.675 --rc geninfo_all_blocks=1 00:07:09.675 --rc geninfo_unexecuted_blocks=1 00:07:09.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.675 ' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:09.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.675 --rc genhtml_branch_coverage=1 00:07:09.675 --rc genhtml_function_coverage=1 00:07:09.675 --rc genhtml_legend=1 00:07:09.675 --rc geninfo_all_blocks=1 00:07:09.675 --rc geninfo_unexecuted_blocks=1 00:07:09.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.675 ' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:09.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.675 --rc genhtml_branch_coverage=1 00:07:09.675 --rc genhtml_function_coverage=1 00:07:09.675 --rc genhtml_legend=1 00:07:09.675 --rc geninfo_all_blocks=1 00:07:09.675 --rc geninfo_unexecuted_blocks=1 00:07:09.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.675 ' 00:07:09.675 12:02:38 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:09.675 12:02:38 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:09.675 12:02:38 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:07:09.675 12:02:38 event -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:07:09.675 12:02:38 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.675 ************************************ 00:07:09.675 START TEST event_perf 00:07:09.675 ************************************ 00:07:09.675 12:02:38 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:09.675 Running I/O for 1 seconds...[2024-11-27 12:02:38.516356] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:09.675 [2024-11-27 12:02:38.516463] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712702 ] 00:07:09.934 [2024-11-27 12:02:38.596008] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.934 [2024-11-27 12:02:38.637402] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.934 [2024-11-27 12:02:38.637500] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:09.934 [2024-11-27 12:02:38.637587] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.934 [2024-11-27 12:02:38.637589] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.871 Running I/O for 1 seconds... 00:07:10.871 lcore 0: 198024 00:07:10.871 lcore 1: 198023 00:07:10.871 lcore 2: 198025 00:07:10.871 lcore 3: 198025 00:07:10.871 done. 00:07:10.871 00:07:10.871 real 0m1.196s 00:07:10.871 user 0m4.092s 00:07:10.871 sys 0m0.099s 00:07:10.871 12:02:39 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:10.871 12:02:39 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:10.871 ************************************ 00:07:10.871 END TEST event_perf 00:07:10.871 ************************************ 00:07:10.871 12:02:39 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:10.871 12:02:39 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:10.871 12:02:39 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:10.871 12:02:39 event -- common/autotest_common.sh@10 -- # set +x 00:07:11.131 ************************************ 00:07:11.131 START TEST event_reactor 00:07:11.131 ************************************ 00:07:11.131 12:02:39 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:11.131 [2024-11-27 12:02:39.798520] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:11.131 [2024-11-27 12:02:39.798619] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712993 ] 00:07:11.131 [2024-11-27 12:02:39.870563] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.131 [2024-11-27 12:02:39.912180] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.509 test_start 00:07:12.509 oneshot 00:07:12.509 tick 100 00:07:12.509 tick 100 00:07:12.509 tick 250 00:07:12.509 tick 100 00:07:12.509 tick 100 00:07:12.509 tick 100 00:07:12.509 tick 250 00:07:12.509 tick 500 00:07:12.509 tick 100 00:07:12.509 tick 100 00:07:12.509 tick 250 00:07:12.509 tick 100 00:07:12.509 tick 100 00:07:12.509 test_end 00:07:12.509 00:07:12.509 real 0m1.191s 00:07:12.509 user 0m1.100s 00:07:12.509 sys 0m0.086s 00:07:12.509 12:02:40 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:12.509 12:02:40 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:12.509 ************************************ 00:07:12.509 END TEST event_reactor 00:07:12.509 ************************************ 00:07:12.509 12:02:41 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:12.509 12:02:41 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:12.509 12:02:41 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:12.509 12:02:41 event -- common/autotest_common.sh@10 -- # set +x 00:07:12.509 ************************************ 00:07:12.509 START TEST event_reactor_perf 00:07:12.509 ************************************ 00:07:12.509 12:02:41 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:12.509 [2024-11-27 12:02:41.074908] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:12.509 [2024-11-27 12:02:41.075008] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713273 ] 00:07:12.509 [2024-11-27 12:02:41.147130] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.509 [2024-11-27 12:02:41.187826] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.445 test_start 00:07:13.445 test_end 00:07:13.445 Performance: 957518 events per second 00:07:13.445 00:07:13.445 real 0m1.185s 00:07:13.445 user 0m1.089s 00:07:13.445 sys 0m0.092s 00:07:13.445 12:02:42 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:13.445 12:02:42 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:13.445 ************************************ 00:07:13.445 END TEST event_reactor_perf 00:07:13.445 ************************************ 00:07:13.445 12:02:42 event -- event/event.sh@49 -- # uname -s 00:07:13.445 12:02:42 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:13.445 12:02:42 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:13.445 12:02:42 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:13.445 12:02:42 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:13.445 12:02:42 event -- common/autotest_common.sh@10 -- # set +x 00:07:13.445 ************************************ 00:07:13.445 START TEST event_scheduler 00:07:13.445 ************************************ 00:07:13.445 12:02:42 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:13.704 * Looking for test storage... 
00:07:13.704 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1681 -- # lcov --version 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:13.704 12:02:42 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:13.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.704 --rc genhtml_branch_coverage=1 00:07:13.704 --rc genhtml_function_coverage=1 00:07:13.704 --rc genhtml_legend=1 00:07:13.704 --rc geninfo_all_blocks=1 00:07:13.704 --rc geninfo_unexecuted_blocks=1 00:07:13.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.704 ' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:13.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.704 --rc genhtml_branch_coverage=1 00:07:13.704 --rc genhtml_function_coverage=1 00:07:13.704 --rc genhtml_legend=1 00:07:13.704 --rc geninfo_all_blocks=1 00:07:13.704 --rc geninfo_unexecuted_blocks=1 00:07:13.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.704 ' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:13.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.704 --rc genhtml_branch_coverage=1 00:07:13.704 --rc genhtml_function_coverage=1 00:07:13.704 --rc genhtml_legend=1 00:07:13.704 --rc geninfo_all_blocks=1 00:07:13.704 --rc geninfo_unexecuted_blocks=1 00:07:13.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.704 ' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:13.704 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.704 --rc genhtml_branch_coverage=1 00:07:13.704 --rc genhtml_function_coverage=1 00:07:13.704 --rc genhtml_legend=1 00:07:13.704 --rc geninfo_all_blocks=1 00:07:13.704 --rc geninfo_unexecuted_blocks=1 00:07:13.704 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.704 ' 00:07:13.704 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:13.704 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1713591 00:07:13.704 12:02:42 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:13.704 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:13.704 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1713591 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 1713591 ']' 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:13.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:13.704 12:02:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.704 [2024-11-27 12:02:42.537226] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:13.704 [2024-11-27 12:02:42.537296] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713591 ] 00:07:13.963 [2024-11-27 12:02:42.602182] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:13.963 [2024-11-27 12:02:42.643653] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.963 [2024-11-27 12:02:42.643735] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:13.963 [2024-11-27 12:02:42.643823] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:13.963 [2024-11-27 12:02:42.643825] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:07:13.963 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.963 [2024-11-27 12:02:42.728552] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:13.963 [2024-11-27 12:02:42.728573] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:13.963 [2024-11-27 12:02:42.728584] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:13.963 [2024-11-27 12:02:42.728592] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:13.963 [2024-11-27 12:02:42.728603] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:13.963 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 
00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.963 [2024-11-27 12:02:42.796834] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:13.963 12:02:42 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:13.963 12:02:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.963 ************************************ 00:07:13.963 START TEST scheduler_create_thread 00:07:13.963 ************************************ 00:07:13.963 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:07:13.963 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:13.963 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:13.963 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 2 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 3 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 4 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 5 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 
12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 6 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 7 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 8 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 9 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 10 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:14.222 12:02:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:15.155 12:02:43 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:15.155 12:02:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:15.155 12:02:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:15.155 12:02:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:16.528 12:02:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:16.528 12:02:45 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:16.528 12:02:45 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:16.528 12:02:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:16.528 12:02:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.469 12:02:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:17.469 00:07:17.469 real 0m3.383s 00:07:17.469 user 0m0.028s 00:07:17.469 sys 0m0.004s 00:07:17.469 12:02:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:17.469 12:02:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.469 ************************************ 00:07:17.469 END TEST scheduler_create_thread 00:07:17.469 ************************************ 00:07:17.469 12:02:46 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:17.469 12:02:46 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1713591 00:07:17.469 12:02:46 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 1713591 ']' 00:07:17.469 12:02:46 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 1713591 00:07:17.469 12:02:46 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1713591 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1713591' 00:07:17.470 killing process with pid 1713591 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 1713591 00:07:17.470 12:02:46 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 1713591 00:07:17.743 [2024-11-27 12:02:46.596807] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
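For readers reconstructing the scheduler test from the trace above, the following is a minimal sketch (not the test script itself) of the RPC sequence it exercises. It assumes rpc_cmd maps directly onto scripts/rpc.py against the default /var/tmp/spdk.sock, that the scheduler_plugin module is on PYTHONPATH as the test harness arranges, and that the relative paths stand in for the workspace paths in the log; the thread name, mask, and load values are illustrative.

    # Start the scheduler test app on 4 cores (0xF) with main lcore 2, held at --wait-for-rpc.
    ./test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f &
    scheduler_pid=$!

    # Select the dynamic scheduler before subsystem init, then let initialization proceed.
    ./scripts/rpc.py framework_set_scheduler dynamic
    ./scripts/rpc.py framework_start_init

    # Create a pinned thread with a CPU mask and active load (percent), retarget it, delete it,
    # mirroring the scheduler_thread_* rpc_cmd calls logged above.
    tid=$(./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100)
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete "$tid"

    # Tear down the app, as the trap/killprocess in the trace does.
    kill "$scheduler_pid"; wait "$scheduler_pid"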
00:07:18.029 00:07:18.029 real 0m4.493s 00:07:18.029 user 0m7.900s 00:07:18.029 sys 0m0.430s 00:07:18.029 12:02:46 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:18.029 12:02:46 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.029 ************************************ 00:07:18.029 END TEST event_scheduler 00:07:18.029 ************************************ 00:07:18.029 12:02:46 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:18.029 12:02:46 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:18.029 12:02:46 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:18.029 12:02:46 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:18.029 12:02:46 event -- common/autotest_common.sh@10 -- # set +x 00:07:18.029 ************************************ 00:07:18.029 START TEST app_repeat 00:07:18.029 ************************************ 00:07:18.029 12:02:46 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1714406 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1714406' 00:07:18.029 Process app_repeat pid: 1714406 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:18.029 spdk_app_start Round 0 00:07:18.029 12:02:46 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1714406 /var/tmp/spdk-nbd.sock 00:07:18.029 12:02:46 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1714406 ']' 00:07:18.029 12:02:46 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:18.309 12:02:46 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:18.309 12:02:46 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:18.309 12:02:46 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:18.309 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:18.309 12:02:46 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:18.309 12:02:46 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:18.309 [2024-11-27 12:02:46.929233] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:18.309 [2024-11-27 12:02:46.929317] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1714406 ] 00:07:18.309 [2024-11-27 12:02:46.999791] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:18.309 [2024-11-27 12:02:47.041105] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.309 [2024-11-27 12:02:47.041108] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.309 12:02:47 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:18.309 12:02:47 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:18.309 12:02:47 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.568 Malloc0 00:07:18.568 12:02:47 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.827 Malloc1 00:07:18.827 12:02:47 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.827 12:02:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:18.827 /dev/nbd0 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 
/proc/partitions 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:19.089 1+0 records in 00:07:19.089 1+0 records out 00:07:19.089 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209216 s, 19.6 MB/s 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:19.089 /dev/nbd1 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:19.089 12:02:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:19.089 12:02:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:19.090 1+0 records in 00:07:19.090 1+0 records out 00:07:19.090 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263748 s, 15.5 MB/s 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.090 12:02:47 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:19.349 12:02:47 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.349 12:02:47 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:19.349 12:02:47 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:19.349 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:19.349 12:02:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
00:07:19.349 12:02:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:19.349 12:02:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.349 12:02:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:19.349 { 00:07:19.349 "nbd_device": "/dev/nbd0", 00:07:19.349 "bdev_name": "Malloc0" 00:07:19.349 }, 00:07:19.349 { 00:07:19.349 "nbd_device": "/dev/nbd1", 00:07:19.349 "bdev_name": "Malloc1" 00:07:19.349 } 00:07:19.349 ]' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:19.349 { 00:07:19.349 "nbd_device": "/dev/nbd0", 00:07:19.349 "bdev_name": "Malloc0" 00:07:19.349 }, 00:07:19.349 { 00:07:19.349 "nbd_device": "/dev/nbd1", 00:07:19.349 "bdev_name": "Malloc1" 00:07:19.349 } 00:07:19.349 ]' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:19.349 /dev/nbd1' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:19.349 /dev/nbd1' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:19.349 12:02:48 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:19.609 256+0 records in 00:07:19.609 256+0 records out 00:07:19.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102844 s, 102 MB/s 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:19.609 256+0 records in 00:07:19.609 256+0 records out 00:07:19.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198829 s, 52.7 MB/s 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:19.609 256+0 records in 00:07:19.609 256+0 records out 00:07:19.609 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0207574 s, 50.5 MB/s 
00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.609 12:02:48 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.867 12:02:48 event.app_repeat -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.867 12:02:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:20.126 12:02:48 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:20.126 12:02:48 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:20.385 12:02:49 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:20.645 [2024-11-27 12:02:49.354471] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:20.645 [2024-11-27 12:02:49.389983] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.645 [2024-11-27 12:02:49.389985] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.645 [2024-11-27 12:02:49.429834] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:20.645 [2024-11-27 12:02:49.429880] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:23.934 spdk_app_start Round 1 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1714406 /var/tmp/spdk-nbd.sock 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1714406 ']' 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:23.934 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:23.934 12:02:52 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.934 Malloc0 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.934 Malloc1 00:07:23.934 12:02:52 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.934 12:02:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:24.193 /dev/nbd0 00:07:24.193 12:02:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:24.193 12:02:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.193 1+0 records in 00:07:24.193 1+0 records out 00:07:24.193 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256042 s, 16.0 MB/s 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:24.193 12:02:53 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:24.193 12:02:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.193 12:02:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.193 12:02:53 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:24.452 /dev/nbd1 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.452 1+0 records in 00:07:24.452 1+0 records out 00:07:24.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241699 s, 16.9 MB/s 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:24.452 12:02:53 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.452 12:02:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.712 { 00:07:24.712 "nbd_device": "/dev/nbd0", 00:07:24.712 "bdev_name": "Malloc0" 00:07:24.712 }, 00:07:24.712 { 00:07:24.712 "nbd_device": "/dev/nbd1", 00:07:24.712 "bdev_name": "Malloc1" 00:07:24.712 } 00:07:24.712 ]' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.712 { 00:07:24.712 "nbd_device": "/dev/nbd0", 00:07:24.712 "bdev_name": "Malloc0" 00:07:24.712 }, 00:07:24.712 { 00:07:24.712 "nbd_device": "/dev/nbd1", 00:07:24.712 "bdev_name": "Malloc1" 00:07:24.712 } 00:07:24.712 ]' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.712 /dev/nbd1' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.712 /dev/nbd1' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.712 256+0 records in 00:07:24.712 256+0 records out 00:07:24.712 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108057 s, 97.0 MB/s 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.712 256+0 records in 00:07:24.712 256+0 records out 00:07:24.712 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196791 s, 53.3 MB/s 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.712 12:02:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.971 256+0 records in 00:07:24.971 256+0 records out 00:07:24.971 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212721 s, 49.3 MB/s 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.971 12:02:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.230 12:02:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:25.489 12:02:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:25.490 12:02:54 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:25.490 12:02:54 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:25.749 12:02:54 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:26.009 [2024-11-27 12:02:54.680870] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:26.009 [2024-11-27 12:02:54.716471] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.009 [2024-11-27 12:02:54.716475] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.009 [2024-11-27 12:02:54.757371] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:26.009 [2024-11-27 12:02:54.757417] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:29.301 12:02:57 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:29.301 12:02:57 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:29.301 spdk_app_start Round 2 00:07:29.301 12:02:57 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1714406 /var/tmp/spdk-nbd.sock 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1714406 ']' 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:29.301 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:29.301 12:02:57 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:29.301 12:02:57 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.301 Malloc0 00:07:29.301 12:02:57 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.301 Malloc1 00:07:29.301 12:02:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.301 12:02:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.561 /dev/nbd0 00:07:29.561 12:02:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.561 12:02:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.561 1+0 records in 00:07:29.561 1+0 records out 00:07:29.561 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216967 s, 18.9 MB/s 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:29.561 12:02:58 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:29.561 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.561 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.561 12:02:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:29.820 /dev/nbd1 00:07:29.820 12:02:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:29.820 12:02:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.820 1+0 records in 00:07:29.820 1+0 records out 00:07:29.820 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000163089 s, 25.1 MB/s 00:07:29.820 12:02:58 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.821 12:02:58 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:29.821 12:02:58 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.821 12:02:58 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:29.821 12:02:58 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:29.821 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.821 12:02:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.821 12:02:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.821 12:02:58 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.821 12:02:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:30.079 12:02:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:30.079 { 00:07:30.079 "nbd_device": "/dev/nbd0", 00:07:30.079 "bdev_name": "Malloc0" 00:07:30.079 }, 00:07:30.079 { 00:07:30.079 "nbd_device": "/dev/nbd1", 00:07:30.079 "bdev_name": "Malloc1" 00:07:30.079 } 00:07:30.079 ]' 00:07:30.079 12:02:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:30.080 { 00:07:30.080 "nbd_device": "/dev/nbd0", 00:07:30.080 "bdev_name": "Malloc0" 00:07:30.080 }, 00:07:30.080 { 00:07:30.080 "nbd_device": "/dev/nbd1", 00:07:30.080 "bdev_name": "Malloc1" 00:07:30.080 } 00:07:30.080 ]' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:30.080 /dev/nbd1' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:30.080 /dev/nbd1' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:30.080 256+0 records in 00:07:30.080 256+0 records out 00:07:30.080 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103493 s, 101 MB/s 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:30.080 256+0 records in 00:07:30.080 256+0 records out 00:07:30.080 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019439 s, 53.9 MB/s 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:30.080 256+0 records in 00:07:30.080 256+0 records out 00:07:30.080 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021323 s, 49.2 MB/s 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.080 12:02:58 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.339 12:02:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.596 12:02:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:30.855 12:02:59 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:30.855 12:02:59 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:31.115 12:02:59 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:31.115 [2024-11-27 12:02:59.952596] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:31.115 [2024-11-27 12:02:59.987725] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.115 [2024-11-27 12:02:59.987728] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.375 [2024-11-27 12:03:00.029086] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:31.375 [2024-11-27 12:03:00.029129] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:33.912 12:03:02 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1714406 /var/tmp/spdk-nbd.sock 00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1714406 ']' 00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:33.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
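The dd/cmp sequence traced above is nbd_common.sh's nbd_dd_data_verify run once in write mode and once in verify mode, followed by nbd_stop_disks. A minimal sketch of that round trip, with the long workspace paths shortened and the poll interval assumed (the trace only shows the loop bounds and the grep):

    # fill a 1 MiB scratch file and copy it to every exported NBD device
    tmp_file=/tmp/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1)
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done

    # verify: every device must match the scratch file byte for byte
    for dev in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp_file" "$dev"
    done
    rm "$tmp_file"

    # teardown: stop each disk over the RPC socket, then wait for the
    # kernel to drop the name from /proc/partitions (up to 20 tries)
    for dev in "${nbd_list[@]}"; do
        rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1
        done
    done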
00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:33.912 12:03:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:34.171 12:03:02 event.app_repeat -- event/event.sh@39 -- # killprocess 1714406 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 1714406 ']' 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 1714406 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:34.171 12:03:02 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1714406 00:07:34.171 12:03:03 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:34.171 12:03:03 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:34.171 12:03:03 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1714406' 00:07:34.171 killing process with pid 1714406 00:07:34.171 12:03:03 event.app_repeat -- common/autotest_common.sh@969 -- # kill 1714406 00:07:34.171 12:03:03 event.app_repeat -- common/autotest_common.sh@974 -- # wait 1714406 00:07:34.430 spdk_app_start is called in Round 0. 00:07:34.431 Shutdown signal received, stop current app iteration 00:07:34.431 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:07:34.431 spdk_app_start is called in Round 1. 00:07:34.431 Shutdown signal received, stop current app iteration 00:07:34.431 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:07:34.431 spdk_app_start is called in Round 2. 00:07:34.431 Shutdown signal received, stop current app iteration 00:07:34.431 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:07:34.431 spdk_app_start is called in Round 3. 
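Every test in this job tears its target down through the same killprocess helper whose xtrace appears here: check the pid is still alive, read its command name, refuse to signal a bare sudo wrapper, then kill and wait. A condensed sketch (the sudo branch and error messages are abbreviated relative to the trace):

    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1
        kill -0 "$pid" || return 1                      # still running?
        if [ "$(uname)" = Linux ]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")     # e.g. reactor_0
            [ "$name" = sudo ] && return 1              # never signal the sudo wrapper itself
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }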
00:07:34.431 Shutdown signal received, stop current app iteration 00:07:34.431 12:03:03 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:34.431 12:03:03 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:34.431 00:07:34.431 real 0m16.290s 00:07:34.431 user 0m34.919s 00:07:34.431 sys 0m3.279s 00:07:34.431 12:03:03 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:34.431 12:03:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:34.431 ************************************ 00:07:34.431 END TEST app_repeat 00:07:34.431 ************************************ 00:07:34.431 12:03:03 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:34.431 12:03:03 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:34.431 12:03:03 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:34.431 12:03:03 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:34.431 12:03:03 event -- common/autotest_common.sh@10 -- # set +x 00:07:34.431 ************************************ 00:07:34.431 START TEST cpu_locks 00:07:34.431 ************************************ 00:07:34.431 12:03:03 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:34.690 * Looking for test storage... 00:07:34.690 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1681 -- # lcov --version 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:34.690 12:03:03 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:34.690 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.690 --rc genhtml_branch_coverage=1 00:07:34.690 --rc genhtml_function_coverage=1 00:07:34.690 --rc genhtml_legend=1 00:07:34.690 --rc geninfo_all_blocks=1 00:07:34.690 --rc geninfo_unexecuted_blocks=1 00:07:34.690 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.690 ' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:34.690 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.690 --rc genhtml_branch_coverage=1 00:07:34.690 --rc genhtml_function_coverage=1 00:07:34.690 --rc genhtml_legend=1 00:07:34.690 --rc geninfo_all_blocks=1 00:07:34.690 --rc geninfo_unexecuted_blocks=1 00:07:34.690 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.690 ' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:34.690 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.690 --rc genhtml_branch_coverage=1 00:07:34.690 --rc genhtml_function_coverage=1 00:07:34.690 --rc genhtml_legend=1 00:07:34.690 --rc geninfo_all_blocks=1 00:07:34.690 --rc geninfo_unexecuted_blocks=1 00:07:34.690 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.690 ' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:34.690 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.690 --rc genhtml_branch_coverage=1 00:07:34.690 --rc genhtml_function_coverage=1 00:07:34.690 --rc genhtml_legend=1 00:07:34.690 --rc geninfo_all_blocks=1 00:07:34.690 --rc geninfo_unexecuted_blocks=1 00:07:34.690 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.690 ' 00:07:34.690 12:03:03 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:34.690 12:03:03 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:34.690 12:03:03 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:34.690 12:03:03 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:34.690 12:03:03 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:34.690 ************************************ 00:07:34.690 START TEST default_locks 00:07:34.690 ************************************ 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1717504 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 1717504 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 1717504 ']' 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.690 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:34.690 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:34.690 [2024-11-27 12:03:03.531932] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
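The lcov probe a few entries back funnels into scripts/common.sh's version helpers: the reported version (1.15 here) is split on '.', '-' and ':' and compared field by field against 2, which selects the llvm-gcov wrapper options exported as LCOV_OPTS. A rough reconstruction of that comparison (only the operators seen in the trace are handled):

    cmp_versions() {            # e.g. cmp_versions 1.15 '<' 2
        local -a ver1 ver2
        local op=$2 v d1 d2
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$3"
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            d1=${ver1[v]:-0} d2=${ver2[v]:-0}
            if ((d1 > d2)); then [[ $op == '>' ]]; return; fi
            if ((d1 < d2)); then [[ $op == '<' ]]; return; fi
        done
        [[ $op == '==' ]]       # all fields equal
    }
    lt() { cmp_versions "$1" '<' "$2"; }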
00:07:34.690 [2024-11-27 12:03:03.532004] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1717504 ] 00:07:34.949 [2024-11-27 12:03:03.600552] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.949 [2024-11-27 12:03:03.638872] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.949 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:34.949 12:03:03 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:07:34.949 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 1717504 00:07:34.949 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 1717504 00:07:34.949 12:03:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:35.208 lslocks: write error 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 1717504 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 1717504 ']' 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 1717504 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:35.208 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1717504 00:07:35.468 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:35.468 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:35.468 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1717504' 00:07:35.468 killing process with pid 1717504 00:07:35.468 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 1717504 00:07:35.468 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 1717504 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1717504 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1717504 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 1717504 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 1717504 ']' 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local 
max_retries=100 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.728 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1717504) - No such process 00:07:35.728 ERROR: process (pid: 1717504) is no longer running 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:35.728 00:07:35.728 real 0m0.929s 00:07:35.728 user 0m0.880s 00:07:35.728 sys 0m0.483s 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:35.728 12:03:04 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.728 ************************************ 00:07:35.728 END TEST default_locks 00:07:35.728 ************************************ 00:07:35.728 12:03:04 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:35.728 12:03:04 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:35.728 12:03:04 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:35.728 12:03:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.728 ************************************ 00:07:35.728 START TEST default_locks_via_rpc 00:07:35.728 ************************************ 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1717967 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 1717967 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1717967 ']' 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 
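default_locks reduces to: start spdk_tgt pinned to core 0, confirm that pid holds the core lock, kill it, then assert both that a second waitforlisten on the dead pid fails and that no lock files are left behind. The two small checks it leans on look roughly like this ("lslocks: write error" in the log is harmless — grep -q closes the pipe as soon as it matches; the nullglob handling below is added here to keep the empty case honest):

    locks_exist() {             # does this pid hold an spdk_cpu_lock file lock?
        lslocks -p "$1" | grep -q spdk_cpu_lock
    }

    no_locks() {                # assert nothing is left under /var/tmp
        shopt -s nullglob
        local lock_files=(/var/tmp/spdk_cpu_lock_*)
        shopt -u nullglob
        (( ${#lock_files[@]} == 0 ))
    }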
00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:35.728 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.728 [2024-11-27 12:03:04.523892] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:35.728 [2024-11-27 12:03:04.523936] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1717967 ] 00:07:35.728 [2024-11-27 12:03:04.588550] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.988 [2024-11-27 12:03:04.628363] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 1717967 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 1717967 00:07:35.988 12:03:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 1717967 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 1717967 ']' 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 1717967 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 
-- # '[' Linux = Linux ']' 00:07:36.556 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1717967 00:07:36.815 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:36.815 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:36.815 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1717967' 00:07:36.815 killing process with pid 1717967 00:07:36.815 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 1717967 00:07:36.815 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 1717967 00:07:37.074 00:07:37.074 real 0m1.271s 00:07:37.074 user 0m1.212s 00:07:37.074 sys 0m0.625s 00:07:37.074 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:37.074 12:03:05 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:37.074 ************************************ 00:07:37.074 END TEST default_locks_via_rpc 00:07:37.074 ************************************ 00:07:37.074 12:03:05 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:37.074 12:03:05 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:37.074 12:03:05 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:37.074 12:03:05 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:37.074 ************************************ 00:07:37.074 START TEST non_locking_app_on_locked_coremask 00:07:37.074 ************************************ 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1718491 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 1718491 /var/tmp/spdk.sock 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1718491 ']' 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:37.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:37.074 12:03:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:37.074 [2024-11-27 12:03:05.867742] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
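default_locks_via_rpc shows the same locks being dropped and re-taken at runtime instead of at startup. Condensed from the trace (binary and rpc.py paths shortened; rpc_cmd is the suite's wrapper around rpc.py):

    spdk_tgt -m 0x1 & pid=$!
    waitforlisten "$pid"

    rpc_cmd framework_disable_cpumask_locks    # lock files disappear
    no_locks

    rpc_cmd framework_enable_cpumask_locks     # re-acquired without a restart
    locks_exist "$pid"

    killprocess "$pid"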
00:07:37.074 [2024-11-27 12:03:05.867785] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1718491 ] 00:07:37.074 [2024-11-27 12:03:05.931476] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.332 [2024-11-27 12:03:05.971851] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1718507 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 1718507 /var/tmp/spdk2.sock 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1718507 ']' 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:37.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:37.332 12:03:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:37.332 [2024-11-27 12:03:06.182497] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:37.332 [2024-11-27 12:03:06.182564] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1718507 ] 00:07:37.591 [2024-11-27 12:03:06.269506] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:37.591 [2024-11-27 12:03:06.269533] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.591 [2024-11-27 12:03:06.353636] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.527 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:38.527 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:38.527 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 1718491 00:07:38.527 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1718491 00:07:38.527 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:39.094 lslocks: write error 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 1718491 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1718491 ']' 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1718491 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1718491 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1718491' 00:07:39.094 killing process with pid 1718491 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1718491 00:07:39.094 12:03:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1718491 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 1718507 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1718507 ']' 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1718507 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1718507 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1718507' 00:07:40.030 
killing process with pid 1718507 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1718507 00:07:40.030 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1718507 00:07:40.289 00:07:40.289 real 0m3.059s 00:07:40.289 user 0m3.219s 00:07:40.289 sys 0m1.110s 00:07:40.289 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:40.289 12:03:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:40.289 ************************************ 00:07:40.289 END TEST non_locking_app_on_locked_coremask 00:07:40.289 ************************************ 00:07:40.289 12:03:08 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:40.289 12:03:08 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:40.289 12:03:08 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:40.289 12:03:08 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.289 ************************************ 00:07:40.289 START TEST locking_app_on_unlocked_coremask 00:07:40.289 ************************************ 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1719073 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 1719073 /var/tmp/spdk.sock 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1719073 ']' 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.289 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:40.289 12:03:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:40.289 [2024-11-27 12:03:09.015226] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:40.289 [2024-11-27 12:03:09.015283] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719073 ] 00:07:40.289 [2024-11-27 12:03:09.081257] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
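non_locking_app_on_locked_coremask, which just finished, demonstrates that a second target may share core 0 with a lock holder as long as it opts out of the core locks and uses its own RPC socket. In outline (paths shortened):

    spdk_tgt -m 0x1 & pid1=$!                    # claims /var/tmp/spdk_cpu_lock_000
    waitforlisten "$pid1"

    spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock & pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock    # starts despite the core overlap

    locks_exist "$pid1"                          # the lock still belongs to the first target
    killprocess "$pid1"
    killprocess "$pid2"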
00:07:40.289 [2024-11-27 12:03:09.081282] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.289 [2024-11-27 12:03:09.120684] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1719078 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 1719078 /var/tmp/spdk2.sock 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1719078 ']' 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:40.547 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:40.547 12:03:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:40.547 [2024-11-27 12:03:09.332647] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:40.547 [2024-11-27 12:03:09.332716] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719078 ] 00:07:40.547 [2024-11-27 12:03:09.418555] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.806 [2024-11-27 12:03:09.505417] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.374 12:03:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:41.374 12:03:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:41.374 12:03:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 1719078 00:07:41.374 12:03:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1719078 00:07:41.374 12:03:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:42.752 lslocks: write error 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 1719073 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1719073 ']' 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 1719073 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1719073 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1719073' 00:07:42.752 killing process with pid 1719073 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 1719073 00:07:42.752 12:03:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 1719073 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 1719078 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1719078 ']' 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 1719078 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1719078 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:43.320 12:03:12 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1719078' 00:07:43.320 killing process with pid 1719078 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 1719078 00:07:43.320 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 1719078 00:07:43.579 00:07:43.579 real 0m3.437s 00:07:43.579 user 0m3.625s 00:07:43.579 sys 0m1.334s 00:07:43.579 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:43.579 12:03:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:43.579 ************************************ 00:07:43.579 END TEST locking_app_on_unlocked_coremask 00:07:43.579 ************************************ 00:07:43.839 12:03:12 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:43.839 12:03:12 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:43.839 12:03:12 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:43.839 12:03:12 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:43.839 ************************************ 00:07:43.839 START TEST locking_app_on_locked_coremask 00:07:43.839 ************************************ 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1719649 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 1719649 /var/tmp/spdk.sock 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1719649 ']' 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:43.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:43.839 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:43.839 [2024-11-27 12:03:12.520266] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
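locking_app_on_unlocked_coremask is the mirror case: the first target opts out of the locks, so the later, normally started instance is the one that ends up owning them — which is why locks_exist is checked against the second pid here. In outline:

    spdk_tgt -m 0x1 --disable-cpumask-locks & pid1=$!   # takes no lock
    waitforlisten "$pid1"
    spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock & pid2=$!    # claims spdk_cpu_lock_000
    waitforlisten "$pid2" /var/tmp/spdk2.sock
    locks_exist "$pid2"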
00:07:43.839 [2024-11-27 12:03:12.520320] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719649 ] 00:07:43.839 [2024-11-27 12:03:12.583618] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.839 [2024-11-27 12:03:12.624409] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1719727 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1719727 /var/tmp/spdk2.sock 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:44.097 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1719727 /var/tmp/spdk2.sock 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 1719727 /var/tmp/spdk2.sock 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1719727 ']' 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:44.098 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:44.098 12:03:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:44.098 [2024-11-27 12:03:12.843436] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:44.098 [2024-11-27 12:03:12.843528] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719727 ] 00:07:44.098 [2024-11-27 12:03:12.939993] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1719649 has claimed it. 00:07:44.098 [2024-11-27 12:03:12.940032] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:44.665 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1719727) - No such process 00:07:44.665 ERROR: process (pid: 1719727) is no longer running 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 1719649 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1719649 00:07:44.665 12:03:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:45.232 lslocks: write error 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 1719649 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1719649 ']' 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1719649 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:45.232 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1719649 00:07:45.491 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:45.491 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:45.491 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1719649' 00:07:45.491 killing process with pid 1719649 00:07:45.491 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1719649 00:07:45.491 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1719649 00:07:45.750 00:07:45.750 real 0m1.951s 00:07:45.750 user 0m2.079s 00:07:45.750 sys 0m0.706s 00:07:45.750 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:07:45.750 12:03:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.750 ************************************ 00:07:45.750 END TEST locking_app_on_locked_coremask 00:07:45.750 ************************************ 00:07:45.750 12:03:14 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:45.750 12:03:14 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.750 12:03:14 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.750 12:03:14 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:45.750 ************************************ 00:07:45.750 START TEST locking_overlapped_coremask 00:07:45.750 ************************************ 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1720091 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 1720091 /var/tmp/spdk.sock 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 1720091 ']' 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.750 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:45.750 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.750 [2024-11-27 12:03:14.547636] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
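In locking_app_on_locked_coremask the second target is started without --disable-cpumask-locks on an already locked core, so app.c's claim_cpu_cores refuses it ("Cannot create lock on core 0, probably process 1719649 has claimed it") and the test asserts that failure with the NOT helper from autotest_common.sh. Stripped of the signal and allow-list bookkeeping visible in the trace, NOT's contract is simply "succeed only if the wrapped command failed":

    NOT() {
        local es=0
        "$@" || es=$?
        # the real helper special-cases es > 128 (killed by a signal);
        # for these lock tests any non-zero status counts as the
        # expected failure
        (( es != 0 ))
    }
    # e.g. NOT waitforlisten "$pid2" /var/tmp/spdk2.sock passes above,
    # because the second spdk_tgt exits before it ever starts listening.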
00:07:45.750 [2024-11-27 12:03:14.547694] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720091 ] 00:07:45.750 [2024-11-27 12:03:14.613351] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:46.009 [2024-11-27 12:03:14.656367] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:46.009 [2024-11-27 12:03:14.656462] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:46.009 [2024-11-27 12:03:14.656462] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1720210 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1720210 /var/tmp/spdk2.sock 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1720210 /var/tmp/spdk2.sock 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 1720210 /var/tmp/spdk2.sock 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 1720210 ']' 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:46.009 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:46.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:46.010 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:46.010 12:03:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:46.010 [2024-11-27 12:03:14.875673] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:46.010 [2024-11-27 12:03:14.875740] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720210 ] 00:07:46.268 [2024-11-27 12:03:14.967472] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1720091 has claimed it. 00:07:46.268 [2024-11-27 12:03:14.967512] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:46.835 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1720210) - No such process 00:07:46.835 ERROR: process (pid: 1720210) is no longer running 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 1720091 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 1720091 ']' 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 1720091 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1720091 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1720091' 00:07:46.835 killing process with pid 1720091 00:07:46.835 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 1720091 00:07:46.835 12:03:15 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 1720091 00:07:47.093 00:07:47.093 real 0m1.393s 00:07:47.093 user 0m3.870s 00:07:47.093 sys 0m0.421s 00:07:47.093 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:47.093 12:03:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:47.093 ************************************ 00:07:47.093 END TEST locking_overlapped_coremask 00:07:47.093 ************************************ 00:07:47.093 12:03:15 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:47.093 12:03:15 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:47.093 12:03:15 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:47.093 12:03:15 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:47.352 ************************************ 00:07:47.352 START TEST locking_overlapped_coremask_via_rpc 00:07:47.352 ************************************ 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1720372 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 1720372 /var/tmp/spdk.sock 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1720372 ']' 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:47.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:47.352 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.352 [2024-11-27 12:03:16.037763] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:47.353 [2024-11-27 12:03:16.037846] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720372 ] 00:07:47.353 [2024-11-27 12:03:16.105133] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:47.353 [2024-11-27 12:03:16.105159] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:47.353 [2024-11-27 12:03:16.145873] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:47.353 [2024-11-27 12:03:16.145967] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:47.353 [2024-11-27 12:03:16.145969] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1720510 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 1720510 /var/tmp/spdk2.sock 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1720510 ']' 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:47.612 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:47.612 12:03:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.612 [2024-11-27 12:03:16.361485] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:47.612 [2024-11-27 12:03:16.361552] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720510 ] 00:07:47.612 [2024-11-27 12:03:16.453318] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
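In this *_via_rpc variant both targets boot with --disable-cpumask-locks, so no lock files exist until framework_enable_cpumask_locks is invoked at runtime; a rough sketch of the sequence the test drives next, using the same rpc.py and sockets as above:

  ./scripts/rpc.py framework_enable_cpumask_locks                         # first target (mask 0x7) claims cores 0-2
  ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks  # second target (mask 0x1c) should fail: core 2 already claimed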
00:07:47.612 [2024-11-27 12:03:16.453351] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:47.871 [2024-11-27 12:03:16.533277] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:47.871 [2024-11-27 12:03:16.533397] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:47.871 [2024-11-27 12:03:16.533398] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.440 [2024-11-27 12:03:17.233665] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1720372 has claimed it. 
00:07:48.440 request: 00:07:48.440 { 00:07:48.440 "method": "framework_enable_cpumask_locks", 00:07:48.440 "req_id": 1 00:07:48.440 } 00:07:48.440 Got JSON-RPC error response 00:07:48.440 response: 00:07:48.440 { 00:07:48.440 "code": -32603, 00:07:48.440 "message": "Failed to claim CPU core: 2" 00:07:48.440 } 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 1720372 /var/tmp/spdk.sock 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1720372 ']' 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.440 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:48.440 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 1720510 /var/tmp/spdk2.sock 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1720510 ']' 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:48.699 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
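check_remaining_locks, which both cpu_locks tests use for their final assertion, just globs the lock files and compares them against the set expected for mask 0x7; condensed from the event/cpu_locks.sh lines traced above:

  locks=(/var/tmp/spdk_cpu_lock_*)
  locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
  [[ "${locks[*]}" == "${locks_expected[*]}" ]]   # passes only if cores 0-2 are the sole locks held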
00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:48.699 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:48.959 00:07:48.959 real 0m1.640s 00:07:48.959 user 0m0.777s 00:07:48.959 sys 0m0.169s 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:48.959 12:03:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.959 ************************************ 00:07:48.959 END TEST locking_overlapped_coremask_via_rpc 00:07:48.959 ************************************ 00:07:48.959 12:03:17 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:48.959 12:03:17 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1720372 ]] 00:07:48.959 12:03:17 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1720372 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1720372 ']' 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1720372 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1720372 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1720372' 00:07:48.959 killing process with pid 1720372 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 1720372 00:07:48.959 12:03:17 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 1720372 00:07:49.218 12:03:18 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1720510 ]] 00:07:49.218 12:03:18 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1720510 00:07:49.218 12:03:18 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1720510 ']' 00:07:49.218 12:03:18 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1720510 00:07:49.218 12:03:18 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:49.218 12:03:18 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:07:49.218 12:03:18 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1720510 00:07:49.477 12:03:18 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:49.477 12:03:18 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:49.477 12:03:18 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1720510' 00:07:49.477 killing process with pid 1720510 00:07:49.477 12:03:18 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 1720510 00:07:49.477 12:03:18 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 1720510 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1720372 ]] 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1720372 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1720372 ']' 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1720372 00:07:49.736 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1720372) - No such process 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 1720372 is not found' 00:07:49.736 Process with pid 1720372 is not found 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1720510 ]] 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1720510 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1720510 ']' 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1720510 00:07:49.736 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1720510) - No such process 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 1720510 is not found' 00:07:49.736 Process with pid 1720510 is not found 00:07:49.736 12:03:18 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:49.736 00:07:49.736 real 0m15.207s 00:07:49.736 user 0m25.524s 00:07:49.736 sys 0m5.924s 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:49.736 12:03:18 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:49.736 ************************************ 00:07:49.736 END TEST cpu_locks 00:07:49.736 ************************************ 00:07:49.736 00:07:49.736 real 0m40.260s 00:07:49.736 user 1m14.918s 00:07:49.736 sys 0m10.359s 00:07:49.736 12:03:18 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:49.736 12:03:18 event -- common/autotest_common.sh@10 -- # set +x 00:07:49.736 ************************************ 00:07:49.736 END TEST event 00:07:49.736 ************************************ 00:07:49.736 12:03:18 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:49.736 12:03:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:49.736 12:03:18 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:49.736 12:03:18 -- common/autotest_common.sh@10 -- # set +x 00:07:49.736 ************************************ 00:07:49.736 START TEST thread 00:07:49.736 ************************************ 00:07:49.736 12:03:18 thread -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:49.994 * Looking for test storage... 00:07:49.994 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1681 -- # lcov --version 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:49.994 12:03:18 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.994 12:03:18 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.994 12:03:18 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.994 12:03:18 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.994 12:03:18 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.994 12:03:18 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.994 12:03:18 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.994 12:03:18 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.994 12:03:18 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.994 12:03:18 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.994 12:03:18 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.994 12:03:18 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:49.994 12:03:18 thread -- scripts/common.sh@345 -- # : 1 00:07:49.994 12:03:18 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.994 12:03:18 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:49.994 12:03:18 thread -- scripts/common.sh@365 -- # decimal 1 00:07:49.994 12:03:18 thread -- scripts/common.sh@353 -- # local d=1 00:07:49.994 12:03:18 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.994 12:03:18 thread -- scripts/common.sh@355 -- # echo 1 00:07:49.994 12:03:18 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.994 12:03:18 thread -- scripts/common.sh@366 -- # decimal 2 00:07:49.994 12:03:18 thread -- scripts/common.sh@353 -- # local d=2 00:07:49.994 12:03:18 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.994 12:03:18 thread -- scripts/common.sh@355 -- # echo 2 00:07:49.994 12:03:18 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.994 12:03:18 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.994 12:03:18 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.994 12:03:18 thread -- scripts/common.sh@368 -- # return 0 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:49.994 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.994 --rc genhtml_branch_coverage=1 00:07:49.994 --rc genhtml_function_coverage=1 00:07:49.994 --rc genhtml_legend=1 00:07:49.994 --rc geninfo_all_blocks=1 00:07:49.994 --rc geninfo_unexecuted_blocks=1 00:07:49.994 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.994 ' 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:49.994 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.994 --rc genhtml_branch_coverage=1 00:07:49.994 --rc genhtml_function_coverage=1 00:07:49.994 --rc genhtml_legend=1 
00:07:49.994 --rc geninfo_all_blocks=1 00:07:49.994 --rc geninfo_unexecuted_blocks=1 00:07:49.994 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.994 ' 00:07:49.994 12:03:18 thread -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:49.994 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.994 --rc genhtml_branch_coverage=1 00:07:49.994 --rc genhtml_function_coverage=1 00:07:49.995 --rc genhtml_legend=1 00:07:49.995 --rc geninfo_all_blocks=1 00:07:49.995 --rc geninfo_unexecuted_blocks=1 00:07:49.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.995 ' 00:07:49.995 12:03:18 thread -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:49.995 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.995 --rc genhtml_branch_coverage=1 00:07:49.995 --rc genhtml_function_coverage=1 00:07:49.995 --rc genhtml_legend=1 00:07:49.995 --rc geninfo_all_blocks=1 00:07:49.995 --rc geninfo_unexecuted_blocks=1 00:07:49.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.995 ' 00:07:49.995 12:03:18 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:49.995 12:03:18 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:49.995 12:03:18 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:49.995 12:03:18 thread -- common/autotest_common.sh@10 -- # set +x 00:07:49.995 ************************************ 00:07:49.995 START TEST thread_poller_perf 00:07:49.995 ************************************ 00:07:49.995 12:03:18 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:49.995 [2024-11-27 12:03:18.823533] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:49.995 [2024-11-27 12:03:18.823625] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720896 ] 00:07:50.253 [2024-11-27 12:03:18.895087] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.253 [2024-11-27 12:03:18.932738] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.253 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:51.189 [2024-11-27T11:03:20.073Z] ====================================== 00:07:51.189 [2024-11-27T11:03:20.073Z] busy:2504225746 (cyc) 00:07:51.189 [2024-11-27T11:03:20.073Z] total_run_count: 842000 00:07:51.189 [2024-11-27T11:03:20.073Z] tsc_hz: 2500000000 (cyc) 00:07:51.189 [2024-11-27T11:03:20.073Z] ====================================== 00:07:51.189 [2024-11-27T11:03:20.073Z] poller_cost: 2974 (cyc), 1189 (nsec) 00:07:51.189 00:07:51.189 real 0m1.183s 00:07:51.189 user 0m1.088s 00:07:51.189 sys 0m0.091s 00:07:51.189 12:03:19 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:51.189 12:03:19 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:51.189 ************************************ 00:07:51.189 END TEST thread_poller_perf 00:07:51.189 ************************************ 00:07:51.189 12:03:20 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:51.189 12:03:20 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:51.189 12:03:20 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:51.189 12:03:20 thread -- common/autotest_common.sh@10 -- # set +x 00:07:51.189 ************************************ 00:07:51.189 START TEST thread_poller_perf 00:07:51.189 ************************************ 00:07:51.189 12:03:20 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:51.189 [2024-11-27 12:03:20.070380] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:51.189 [2024-11-27 12:03:20.070434] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721180 ] 00:07:51.448 [2024-11-27 12:03:20.134251] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.448 [2024-11-27 12:03:20.172917] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.448 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:52.385 [2024-11-27T11:03:21.269Z] ====================================== 00:07:52.385 [2024-11-27T11:03:21.269Z] busy:2501411180 (cyc) 00:07:52.385 [2024-11-27T11:03:21.269Z] total_run_count: 12714000 00:07:52.385 [2024-11-27T11:03:21.269Z] tsc_hz: 2500000000 (cyc) 00:07:52.385 [2024-11-27T11:03:21.269Z] ====================================== 00:07:52.385 [2024-11-27T11:03:21.269Z] poller_cost: 196 (cyc), 78 (nsec) 00:07:52.385 00:07:52.385 real 0m1.166s 00:07:52.385 user 0m1.072s 00:07:52.385 sys 0m0.089s 00:07:52.385 12:03:21 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:52.385 12:03:21 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:52.385 ************************************ 00:07:52.385 END TEST thread_poller_perf 00:07:52.385 ************************************ 00:07:52.643 12:03:21 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:52.643 12:03:21 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:52.643 12:03:21 thread -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:52.643 12:03:21 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:52.643 12:03:21 thread -- common/autotest_common.sh@10 -- # set +x 00:07:52.643 ************************************ 00:07:52.643 START TEST thread_spdk_lock 00:07:52.643 ************************************ 00:07:52.643 12:03:21 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:52.643 [2024-11-27 12:03:21.328911] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:52.643 [2024-11-27 12:03:21.329027] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721462 ] 00:07:52.643 [2024-11-27 12:03:21.398916] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:52.643 [2024-11-27 12:03:21.437207] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:52.643 [2024-11-27 12:03:21.437209] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.211 [2024-11-27 12:03:21.918613] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 967:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:53.211 [2024-11-27 12:03:21.918651] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3080:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:53.211 [2024-11-27 12:03:21.918661] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3035:sspin_stacks_print: *ERROR*: spinlock 0x134f900 00:07:53.212 [2024-11-27 12:03:21.919415] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:53.212 [2024-11-27 12:03:21.919519] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1028:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:53.212 [2024-11-27 
12:03:21.919538] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:53.212 Starting test contend 00:07:53.212 Worker Delay Wait us Hold us Total us 00:07:53.212 0 3 163399 181660 345059 00:07:53.212 1 5 79954 283507 363461 00:07:53.212 PASS test contend 00:07:53.212 Starting test hold_by_poller 00:07:53.212 PASS test hold_by_poller 00:07:53.212 Starting test hold_by_message 00:07:53.212 PASS test hold_by_message 00:07:53.212 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:53.212 100014 assertions passed 00:07:53.212 0 assertions failed 00:07:53.212 00:07:53.212 real 0m0.661s 00:07:53.212 user 0m1.052s 00:07:53.212 sys 0m0.088s 00:07:53.212 12:03:21 thread.thread_spdk_lock -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:53.212 12:03:21 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:53.212 ************************************ 00:07:53.212 END TEST thread_spdk_lock 00:07:53.212 ************************************ 00:07:53.212 00:07:53.212 real 0m3.425s 00:07:53.212 user 0m3.397s 00:07:53.212 sys 0m0.531s 00:07:53.212 12:03:22 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:53.212 12:03:22 thread -- common/autotest_common.sh@10 -- # set +x 00:07:53.212 ************************************ 00:07:53.212 END TEST thread 00:07:53.212 ************************************ 00:07:53.212 12:03:22 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:53.212 12:03:22 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:53.212 12:03:22 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:53.212 12:03:22 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:53.212 12:03:22 -- common/autotest_common.sh@10 -- # set +x 00:07:53.212 ************************************ 00:07:53.212 START TEST app_cmdline 00:07:53.212 ************************************ 00:07:53.212 12:03:22 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:53.471 * Looking for test storage... 
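For reference, the poller_cost values printed by the two poller_perf runs above appear to be the busy cycle count divided by total_run_count, converted to nanoseconds with the reported TSC rate; the first run's numbers in shell arithmetic:

  echo $(( 2504225746 / 842000 ))              # ~2974 cycles per poller invocation
  echo $(( 2974 * 1000000000 / 2500000000 ))   # ~1189 nsec at tsc_hz = 2.5 GHz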
00:07:53.471 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1681 -- # lcov --version 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:53.471 12:03:22 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:53.471 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.471 --rc genhtml_branch_coverage=1 00:07:53.471 --rc genhtml_function_coverage=1 00:07:53.471 --rc genhtml_legend=1 00:07:53.471 --rc geninfo_all_blocks=1 00:07:53.471 --rc geninfo_unexecuted_blocks=1 00:07:53.471 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.471 ' 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:53.471 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.471 --rc genhtml_branch_coverage=1 00:07:53.471 --rc genhtml_function_coverage=1 00:07:53.471 --rc 
genhtml_legend=1 00:07:53.471 --rc geninfo_all_blocks=1 00:07:53.471 --rc geninfo_unexecuted_blocks=1 00:07:53.471 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.471 ' 00:07:53.471 12:03:22 app_cmdline -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:53.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.472 --rc genhtml_branch_coverage=1 00:07:53.472 --rc genhtml_function_coverage=1 00:07:53.472 --rc genhtml_legend=1 00:07:53.472 --rc geninfo_all_blocks=1 00:07:53.472 --rc geninfo_unexecuted_blocks=1 00:07:53.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.472 ' 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:53.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:53.472 --rc genhtml_branch_coverage=1 00:07:53.472 --rc genhtml_function_coverage=1 00:07:53.472 --rc genhtml_legend=1 00:07:53.472 --rc geninfo_all_blocks=1 00:07:53.472 --rc geninfo_unexecuted_blocks=1 00:07:53.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:53.472 ' 00:07:53.472 12:03:22 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:53.472 12:03:22 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1721679 00:07:53.472 12:03:22 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1721679 00:07:53.472 12:03:22 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 1721679 ']' 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:53.472 12:03:22 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:53.472 [2024-11-27 12:03:22.281964] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
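The target launched for the cmdline test is restricted with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods are served; a sketch of the three calls the test issues against it (rpc.py path as in this workspace):

  ./scripts/rpc.py spdk_get_version          # allowed: returns the version JSON shown below
  ./scripts/rpc.py rpc_get_methods           # allowed: lists exactly the two permitted methods
  ./scripts/rpc.py env_dpdk_get_mem_stats    # rejected here with -32601 'Method not found'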
00:07:53.472 [2024-11-27 12:03:22.282033] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721679 ] 00:07:53.472 [2024-11-27 12:03:22.348848] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.731 [2024-11-27 12:03:22.388868] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.731 12:03:22 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:53.731 12:03:22 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:53.731 12:03:22 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:53.990 { 00:07:53.990 "version": "SPDK v24.09.1-pre git sha1 b18e1bd62", 00:07:53.990 "fields": { 00:07:53.990 "major": 24, 00:07:53.990 "minor": 9, 00:07:53.990 "patch": 1, 00:07:53.990 "suffix": "-pre", 00:07:53.990 "commit": "b18e1bd62" 00:07:53.990 } 00:07:53.990 } 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:53.990 12:03:22 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:53.990 12:03:22 app_cmdline -- 
common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:53.990 12:03:22 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:54.250 request: 00:07:54.250 { 00:07:54.250 "method": "env_dpdk_get_mem_stats", 00:07:54.250 "req_id": 1 00:07:54.250 } 00:07:54.250 Got JSON-RPC error response 00:07:54.250 response: 00:07:54.250 { 00:07:54.250 "code": -32601, 00:07:54.250 "message": "Method not found" 00:07:54.250 } 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:54.250 12:03:22 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1721679 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 1721679 ']' 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 1721679 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:54.250 12:03:22 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1721679 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1721679' 00:07:54.250 killing process with pid 1721679 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@969 -- # kill 1721679 00:07:54.250 12:03:23 app_cmdline -- common/autotest_common.sh@974 -- # wait 1721679 00:07:54.509 00:07:54.509 real 0m1.275s 00:07:54.509 user 0m1.446s 00:07:54.509 sys 0m0.491s 00:07:54.509 12:03:23 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:54.509 12:03:23 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:54.509 ************************************ 00:07:54.509 END TEST app_cmdline 00:07:54.509 ************************************ 00:07:54.784 12:03:23 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:54.784 12:03:23 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:54.784 12:03:23 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:54.784 12:03:23 -- common/autotest_common.sh@10 -- # set +x 00:07:54.784 ************************************ 00:07:54.784 START TEST version 00:07:54.784 ************************************ 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:54.784 * Looking for test storage... 
00:07:54.784 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1681 -- # lcov --version 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:54.784 12:03:23 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:54.784 12:03:23 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:54.784 12:03:23 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:54.784 12:03:23 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:54.784 12:03:23 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:54.784 12:03:23 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:54.784 12:03:23 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:54.784 12:03:23 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:54.784 12:03:23 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:54.784 12:03:23 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:54.784 12:03:23 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:54.784 12:03:23 version -- scripts/common.sh@344 -- # case "$op" in 00:07:54.784 12:03:23 version -- scripts/common.sh@345 -- # : 1 00:07:54.784 12:03:23 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:54.784 12:03:23 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:54.784 12:03:23 version -- scripts/common.sh@365 -- # decimal 1 00:07:54.784 12:03:23 version -- scripts/common.sh@353 -- # local d=1 00:07:54.784 12:03:23 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:54.784 12:03:23 version -- scripts/common.sh@355 -- # echo 1 00:07:54.784 12:03:23 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:54.784 12:03:23 version -- scripts/common.sh@366 -- # decimal 2 00:07:54.784 12:03:23 version -- scripts/common.sh@353 -- # local d=2 00:07:54.784 12:03:23 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:54.784 12:03:23 version -- scripts/common.sh@355 -- # echo 2 00:07:54.784 12:03:23 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:54.784 12:03:23 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:54.784 12:03:23 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:54.784 12:03:23 version -- scripts/common.sh@368 -- # return 0 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:54.784 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.784 --rc genhtml_branch_coverage=1 00:07:54.784 --rc genhtml_function_coverage=1 00:07:54.784 --rc genhtml_legend=1 00:07:54.784 --rc geninfo_all_blocks=1 00:07:54.784 --rc geninfo_unexecuted_blocks=1 00:07:54.784 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.784 ' 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:54.784 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.784 --rc genhtml_branch_coverage=1 00:07:54.784 --rc genhtml_function_coverage=1 00:07:54.784 --rc genhtml_legend=1 00:07:54.784 --rc geninfo_all_blocks=1 00:07:54.784 --rc geninfo_unexecuted_blocks=1 00:07:54.784 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.784 ' 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:54.784 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.784 --rc genhtml_branch_coverage=1 00:07:54.784 --rc genhtml_function_coverage=1 00:07:54.784 --rc genhtml_legend=1 00:07:54.784 --rc geninfo_all_blocks=1 00:07:54.784 --rc geninfo_unexecuted_blocks=1 00:07:54.784 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.784 ' 00:07:54.784 12:03:23 version -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:54.784 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.784 --rc genhtml_branch_coverage=1 00:07:54.784 --rc genhtml_function_coverage=1 00:07:54.784 --rc genhtml_legend=1 00:07:54.784 --rc geninfo_all_blocks=1 00:07:54.784 --rc geninfo_unexecuted_blocks=1 00:07:54.784 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.784 ' 00:07:54.784 12:03:23 version -- app/version.sh@17 -- # get_header_version major 00:07:54.784 12:03:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.784 12:03:23 version -- app/version.sh@14 -- # cut -f2 00:07:54.784 12:03:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:54.784 12:03:23 version -- app/version.sh@17 -- # major=24 00:07:54.785 12:03:23 version -- app/version.sh@18 -- # get_header_version minor 00:07:54.785 12:03:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:54.785 12:03:23 version -- app/version.sh@14 -- # cut -f2 00:07:54.785 12:03:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:55.062 12:03:23 version -- app/version.sh@18 -- # minor=9 00:07:55.062 12:03:23 version -- app/version.sh@19 -- # get_header_version patch 00:07:55.062 12:03:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:55.062 12:03:23 version -- app/version.sh@14 -- # cut -f2 00:07:55.062 12:03:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:55.062 12:03:23 version -- app/version.sh@19 -- # patch=1 00:07:55.063 12:03:23 version -- app/version.sh@20 -- # get_header_version suffix 00:07:55.063 12:03:23 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:55.063 12:03:23 version -- app/version.sh@14 -- # cut -f2 00:07:55.063 12:03:23 version -- app/version.sh@14 -- # tr -d '"' 00:07:55.063 12:03:23 version -- app/version.sh@20 -- # suffix=-pre 00:07:55.063 12:03:23 version -- app/version.sh@22 -- # version=24.9 00:07:55.063 12:03:23 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:55.063 12:03:23 version -- app/version.sh@25 -- # version=24.9.1 00:07:55.063 12:03:23 version -- app/version.sh@28 -- # version=24.9.1rc0 00:07:55.063 12:03:23 version -- app/version.sh@30 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:55.063 12:03:23 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:55.063 12:03:23 version -- app/version.sh@30 -- # py_version=24.9.1rc0 00:07:55.063 12:03:23 version -- app/version.sh@31 -- # [[ 24.9.1rc0 == \2\4\.\9\.\1\r\c\0 ]] 00:07:55.063 00:07:55.063 real 0m0.281s 00:07:55.063 user 0m0.160s 00:07:55.063 sys 0m0.176s 00:07:55.063 12:03:23 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:55.063 12:03:23 version -- common/autotest_common.sh@10 -- # set +x 00:07:55.063 ************************************ 00:07:55.063 END TEST version 00:07:55.063 ************************************ 00:07:55.063 12:03:23 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@194 -- # uname -s 00:07:55.063 12:03:23 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@256 -- # timing_exit lib 00:07:55.063 12:03:23 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:55.063 12:03:23 -- common/autotest_common.sh@10 -- # set +x 00:07:55.063 12:03:23 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@272 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@307 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@351 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:55.063 12:03:23 -- spdk/autotest.sh@362 -- # [[ 0 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@370 -- # [[ 1 -eq 1 ]] 00:07:55.063 12:03:23 -- spdk/autotest.sh@371 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:55.063 12:03:23 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:55.063 12:03:23 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:55.063 12:03:23 -- common/autotest_common.sh@10 -- # set +x 00:07:55.063 ************************************ 00:07:55.063 START TEST llvm_fuzz 00:07:55.063 ************************************ 00:07:55.063 12:03:23 llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:55.370 * Looking for test storage... 
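version.sh, which just finished above, derives its 24.9.1 string by scraping the SPDK_VERSION_* macros out of include/spdk/version.h with the grep/cut/tr pipeline seen in the trace; condensed:

  major=$(grep -E '^#define SPDK_VERSION_MAJOR' include/spdk/version.h | cut -f2 | tr -d '"')
  minor=$(grep -E '^#define SPDK_VERSION_MINOR' include/spdk/version.h | cut -f2 | tr -d '"')
  patch=$(grep -E '^#define SPDK_VERSION_PATCH' include/spdk/version.h | cut -f2 | tr -d '"')
  echo "${major}.${minor}.${patch}"            # 24.9.1 for the tree under test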
00:07:55.370 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:55.370 12:03:23 llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:55.370 12:03:23 llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:55.370 12:03:23 llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:55.370 12:03:24 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:55.370 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.370 --rc genhtml_branch_coverage=1 00:07:55.370 --rc genhtml_function_coverage=1 00:07:55.370 --rc genhtml_legend=1 00:07:55.370 --rc geninfo_all_blocks=1 00:07:55.370 --rc geninfo_unexecuted_blocks=1 00:07:55.370 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.370 ' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:55.370 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.370 --rc genhtml_branch_coverage=1 00:07:55.370 --rc genhtml_function_coverage=1 00:07:55.370 --rc genhtml_legend=1 00:07:55.370 --rc geninfo_all_blocks=1 00:07:55.370 --rc 
geninfo_unexecuted_blocks=1 00:07:55.370 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.370 ' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:55.370 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.370 --rc genhtml_branch_coverage=1 00:07:55.370 --rc genhtml_function_coverage=1 00:07:55.370 --rc genhtml_legend=1 00:07:55.370 --rc geninfo_all_blocks=1 00:07:55.370 --rc geninfo_unexecuted_blocks=1 00:07:55.370 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.370 ' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:55.370 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.370 --rc genhtml_branch_coverage=1 00:07:55.370 --rc genhtml_function_coverage=1 00:07:55.370 --rc genhtml_legend=1 00:07:55.370 --rc geninfo_all_blocks=1 00:07:55.370 --rc geninfo_unexecuted_blocks=1 00:07:55.370 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.370 ' 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@548 -- # local fuzzers 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:55.370 12:03:24 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:55.370 12:03:24 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:55.370 ************************************ 00:07:55.370 START TEST nvmf_llvm_fuzz 00:07:55.370 ************************************ 00:07:55.370 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:55.370 * Looking for test storage... 
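Before enabling the lcov branch/function coverage flags, the harness checks whether the installed lcov is older than 2.x with the IFS=.-: / read -ra loop traced above. A hedged sketch of that component-wise comparison (simplified; the real scripts/common.sh helper also validates each field with its decimal check):

  lt() { cmp_versions "$1" '<' "$2"; }

  cmp_versions() {
    # Split both versions on '.', '-' and ':' and compare numerically,
    # treating missing components as 0 (a simplification).
    local IFS=.-: op=$2 v a b
    local -a ver1 ver2
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$3"
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
      a=${ver1[v]:-0} b=${ver2[v]:-0}
      if ((a > b)); then [[ $op == '>' ]]; return; fi
      if ((a < b)); then [[ $op == '<' ]]; return; fi
    done
    [[ $op == '==' || $op == '<=' || $op == '>=' ]]
  }

  lt 1.15 2 && echo "lcov is pre-2.x: pass the --rc lcov_*_coverage=1 options"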
00:07:55.370 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.370 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:55.370 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:55.370 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:55.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.632 --rc genhtml_branch_coverage=1 00:07:55.632 --rc genhtml_function_coverage=1 00:07:55.632 --rc genhtml_legend=1 00:07:55.632 --rc geninfo_all_blocks=1 00:07:55.632 --rc geninfo_unexecuted_blocks=1 00:07:55.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.632 ' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:55.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.632 --rc genhtml_branch_coverage=1 00:07:55.632 --rc genhtml_function_coverage=1 00:07:55.632 --rc genhtml_legend=1 00:07:55.632 --rc geninfo_all_blocks=1 00:07:55.632 --rc geninfo_unexecuted_blocks=1 00:07:55.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.632 ' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:55.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.632 --rc genhtml_branch_coverage=1 00:07:55.632 --rc genhtml_function_coverage=1 00:07:55.632 --rc genhtml_legend=1 00:07:55.632 --rc geninfo_all_blocks=1 00:07:55.632 --rc geninfo_unexecuted_blocks=1 00:07:55.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.632 ' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:55.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.632 --rc genhtml_branch_coverage=1 00:07:55.632 --rc genhtml_function_coverage=1 00:07:55.632 --rc genhtml_legend=1 00:07:55.632 --rc geninfo_all_blocks=1 00:07:55.632 --rc geninfo_unexecuted_blocks=1 00:07:55.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.632 ' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:55.632 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:07:55.633 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:55.633 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:55.633 #define SPDK_CONFIG_H 00:07:55.633 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:55.633 #define SPDK_CONFIG_APPS 1 00:07:55.633 #define SPDK_CONFIG_ARCH native 00:07:55.633 #undef SPDK_CONFIG_ASAN 00:07:55.633 #undef SPDK_CONFIG_AVAHI 00:07:55.633 #undef SPDK_CONFIG_CET 00:07:55.633 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:55.633 #define SPDK_CONFIG_COVERAGE 1 00:07:55.633 #define SPDK_CONFIG_CROSS_PREFIX 00:07:55.633 #undef SPDK_CONFIG_CRYPTO 00:07:55.633 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:55.633 #undef SPDK_CONFIG_CUSTOMOCF 00:07:55.633 #undef SPDK_CONFIG_DAOS 00:07:55.633 #define SPDK_CONFIG_DAOS_DIR 00:07:55.633 #define SPDK_CONFIG_DEBUG 1 00:07:55.633 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:55.633 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.633 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:55.633 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.633 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:55.633 #undef SPDK_CONFIG_DPDK_UADK 00:07:55.633 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:55.633 #define SPDK_CONFIG_EXAMPLES 1 00:07:55.633 #undef SPDK_CONFIG_FC 00:07:55.633 #define SPDK_CONFIG_FC_PATH 00:07:55.633 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:55.633 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:55.633 #define SPDK_CONFIG_FSDEV 1 00:07:55.633 #undef SPDK_CONFIG_FUSE 00:07:55.633 #define SPDK_CONFIG_FUZZER 1 00:07:55.633 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:55.633 #undef SPDK_CONFIG_GOLANG 00:07:55.633 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:55.633 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:55.633 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:55.633 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:55.633 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:55.633 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:55.633 #undef SPDK_CONFIG_HAVE_LZ4 00:07:55.633 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:55.633 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:55.633 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:55.633 #define SPDK_CONFIG_IDXD 1 00:07:55.633 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:55.633 #undef SPDK_CONFIG_IPSEC_MB 00:07:55.633 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:55.633 #define SPDK_CONFIG_ISAL 1 00:07:55.633 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:55.633 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:55.633 #define SPDK_CONFIG_LIBDIR 00:07:55.633 #undef SPDK_CONFIG_LTO 00:07:55.633 #define SPDK_CONFIG_MAX_LCORES 128 00:07:55.633 #define SPDK_CONFIG_NVME_CUSE 1 00:07:55.633 #undef SPDK_CONFIG_OCF 00:07:55.633 #define SPDK_CONFIG_OCF_PATH 00:07:55.633 #define SPDK_CONFIG_OPENSSL_PATH 00:07:55.633 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:55.634 #define SPDK_CONFIG_PGO_DIR 00:07:55.634 #undef SPDK_CONFIG_PGO_USE 00:07:55.634 #define SPDK_CONFIG_PREFIX /usr/local 00:07:55.634 #undef SPDK_CONFIG_RAID5F 00:07:55.634 #undef SPDK_CONFIG_RBD 00:07:55.634 #define SPDK_CONFIG_RDMA 1 00:07:55.634 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:55.634 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:55.634 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:55.634 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:55.634 #undef SPDK_CONFIG_SHARED 00:07:55.634 #undef SPDK_CONFIG_SMA 00:07:55.634 #define SPDK_CONFIG_TESTS 1 00:07:55.634 #undef SPDK_CONFIG_TSAN 00:07:55.634 #define SPDK_CONFIG_UBLK 1 00:07:55.634 #define SPDK_CONFIG_UBSAN 1 00:07:55.634 #undef SPDK_CONFIG_UNIT_TESTS 00:07:55.634 #undef SPDK_CONFIG_URING 00:07:55.634 #define SPDK_CONFIG_URING_PATH 00:07:55.634 #undef SPDK_CONFIG_URING_ZNS 00:07:55.634 #undef SPDK_CONFIG_USDT 00:07:55.634 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:55.634 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:55.634 #define SPDK_CONFIG_VFIO_USER 1 00:07:55.634 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:55.634 #define SPDK_CONFIG_VHOST 1 00:07:55.634 #define SPDK_CONFIG_VIRTIO 1 00:07:55.634 #undef SPDK_CONFIG_VTUNE 00:07:55.634 #define SPDK_CONFIG_VTUNE_DIR 00:07:55.634 #define SPDK_CONFIG_WERROR 1 00:07:55.634 #define SPDK_CONFIG_WPDK_DIR 00:07:55.634 #undef SPDK_CONFIG_XNVME 00:07:55.634 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:55.634 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:55.634 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:55.634 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:55.635 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.635 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:55.635 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 1722239 ]] 00:07:55.636 
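The _lcov_opt[_LCOV_LLVM] assignment in the trace points lcov at a --gcov-tool wrapper because the fuzzers are built with clang's coverage instrumentation, and lcov only knows how to drive a gcov-style front end. The wrapper's contents are not shown in this log; a typical shim (an assumption, not the verbatim test/fuzz/llvm/llvm-gcov.sh) is a one-liner:

  #!/usr/bin/env bash
  # Forward every gcov invocation lcov makes to clang's gcov-compatible mode.
  exec llvm-cov gcov "$@"

With that in place, coverage capture follows the options seen in the exports, along these lines (illustrative invocation, not taken from the log):

  lcov --capture --directory "$output_dir" \
       --gcov-tool "$rootdir/test/fuzz/llvm/llvm-gcov.sh" \
       --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 \
       --output-file coverage.info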
12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 1722239 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.demf3d 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.demf3d/tests/nvmf /tmp/spdk.demf3d 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51253981184 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10476625920 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860537856 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340129792 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30863654912 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=1650688 00:07:55.636 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:07:55.637 * Looking for test storage... 
00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51253981184 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12691218432 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.637 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:55.637 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.637 --rc genhtml_branch_coverage=1 00:07:55.637 --rc genhtml_function_coverage=1 00:07:55.637 --rc genhtml_legend=1 00:07:55.637 --rc geninfo_all_blocks=1 00:07:55.637 --rc geninfo_unexecuted_blocks=1 00:07:55.637 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.637 ' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:55.637 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.637 --rc genhtml_branch_coverage=1 00:07:55.637 --rc genhtml_function_coverage=1 00:07:55.637 --rc genhtml_legend=1 00:07:55.637 --rc geninfo_all_blocks=1 00:07:55.637 --rc geninfo_unexecuted_blocks=1 00:07:55.637 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.637 ' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:55.637 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.637 --rc genhtml_branch_coverage=1 00:07:55.637 --rc genhtml_function_coverage=1 00:07:55.637 --rc genhtml_legend=1 00:07:55.637 --rc geninfo_all_blocks=1 00:07:55.637 --rc geninfo_unexecuted_blocks=1 00:07:55.637 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.637 ' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:55.637 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:55.637 --rc genhtml_branch_coverage=1 00:07:55.637 --rc genhtml_function_coverage=1 00:07:55.637 --rc genhtml_legend=1 00:07:55.637 --rc geninfo_all_blocks=1 00:07:55.637 --rc geninfo_unexecuted_blocks=1 00:07:55.637 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:55.637 ' 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:55.637 12:03:24 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.637 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.638 12:03:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:55.897 [2024-11-27 12:03:24.522116] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:55.897 [2024-11-27 12:03:24.522185] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1722297 ] 00:07:55.897 [2024-11-27 12:03:24.695952] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.897 [2024-11-27 12:03:24.717745] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.897 [2024-11-27 12:03:24.769984] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.267 [2024-11-27 12:03:24.786363] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:56.267 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.267 INFO: Seed: 2847357387 00:07:56.267 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:56.267 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:56.267 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:56.267 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.267 #2 INITED exec/s: 0 rss: 65Mb 00:07:56.267 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.267 This may also happen if the target rejected all inputs we tried so far 00:07:56.267 [2024-11-27 12:03:24.841670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.267 [2024-11-27 12:03:24.841700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 NEW_FUNC[1/713]: 0x459648 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:56.526 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.526 #11 NEW cov: 12150 ft: 12131 corp: 2/94b lim: 320 exec/s: 0 rss: 72Mb L: 93/93 MS: 4 CopyPart-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:56.526 [2024-11-27 12:03:25.172601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.526 [2024-11-27 12:03:25.172636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 NEW_FUNC[1/1]: 0x193a8a8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1539 00:07:56.526 #12 NEW cov: 12265 ft: 12616 corp: 3/214b lim: 320 exec/s: 0 rss: 72Mb L: 120/120 MS: 1 CopyPart- 00:07:56.526 [2024-11-27 12:03:25.232653] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.526 [2024-11-27 12:03:25.232680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 #13 NEW cov: 12271 ft: 12875 corp: 4/334b lim: 320 exec/s: 0 rss: 
72Mb L: 120/120 MS: 1 ChangeBit- 00:07:56.526 [2024-11-27 12:03:25.292787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:4 nsid:98989898 cdw10:98989898 cdw11:98989898 00:07:56.526 [2024-11-27 12:03:25.292812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 #15 NEW cov: 12379 ft: 13289 corp: 5/447b lim: 320 exec/s: 0 rss: 72Mb L: 113/120 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:56.526 [2024-11-27 12:03:25.332892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.526 [2024-11-27 12:03:25.332918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 #16 NEW cov: 12379 ft: 13399 corp: 6/567b lim: 320 exec/s: 0 rss: 72Mb L: 120/120 MS: 1 ChangeBinInt- 00:07:56.526 [2024-11-27 12:03:25.372993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.526 [2024-11-27 12:03:25.373020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.526 #17 NEW cov: 12379 ft: 13497 corp: 7/646b lim: 320 exec/s: 0 rss: 72Mb L: 79/120 MS: 1 EraseBytes- 00:07:56.785 [2024-11-27 12:03:25.413118] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.413146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.785 #18 NEW cov: 12379 ft: 13587 corp: 8/768b lim: 320 exec/s: 0 rss: 72Mb L: 122/122 MS: 1 InsertRepeatedBytes- 00:07:56.785 [2024-11-27 12:03:25.473381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.473407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.785 [2024-11-27 12:03:25.473467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:56.785 [2024-11-27 12:03:25.473480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.785 NEW_FUNC[1/1]: 0x1512608 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:56.785 #19 NEW cov: 12410 ft: 13810 corp: 9/916b lim: 320 exec/s: 0 rss: 72Mb L: 148/148 MS: 1 InsertRepeatedBytes- 00:07:56.785 [2024-11-27 12:03:25.513378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.513408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.785 #20 NEW cov: 12410 ft: 13853 corp: 10/1002b lim: 320 exec/s: 0 rss: 72Mb L: 86/148 MS: 1 EraseBytes- 00:07:56.785 [2024-11-27 12:03:25.553593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 
cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.553624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.785 [2024-11-27 12:03:25.553682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.553696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.785 #21 NEW cov: 12410 ft: 13886 corp: 11/1152b lim: 320 exec/s: 0 rss: 72Mb L: 150/150 MS: 1 InsertRepeatedBytes- 00:07:56.785 [2024-11-27 12:03:25.593608] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.593634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.785 #27 NEW cov: 12410 ft: 13901 corp: 12/1272b lim: 320 exec/s: 0 rss: 73Mb L: 120/150 MS: 1 ChangeByte- 00:07:56.785 [2024-11-27 12:03:25.653754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:56.785 [2024-11-27 12:03:25.653780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.043 #28 NEW cov: 12410 ft: 13915 corp: 13/1392b lim: 320 exec/s: 0 rss: 73Mb L: 120/150 MS: 1 ShuffleBytes- 00:07:57.043 [2024-11-27 12:03:25.713928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.713954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.043 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:57.043 #29 NEW cov: 12433 ft: 13955 corp: 14/1471b lim: 320 exec/s: 0 rss: 73Mb L: 79/150 MS: 1 ChangeByte- 00:07:57.043 [2024-11-27 12:03:25.754004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9898989898ffffff 00:07:57.043 [2024-11-27 12:03:25.754029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.043 #30 NEW cov: 12433 ft: 14001 corp: 15/1591b lim: 320 exec/s: 0 rss: 73Mb L: 120/150 MS: 1 CrossOver- 00:07:57.043 [2024-11-27 12:03:25.794300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.794326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.043 [2024-11-27 12:03:25.794386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:989898ff cdw10:98989898 cdw11:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9898989898989898 00:07:57.043 [2024-11-27 12:03:25.794400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:57.043 [2024-11-27 12:03:25.794455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:98989898 cdw11:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.794472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.043 #31 NEW cov: 12433 ft: 14288 corp: 16/1824b lim: 320 exec/s: 0 rss: 73Mb L: 233/233 MS: 1 CrossOver- 00:07:57.043 [2024-11-27 12:03:25.834498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.834524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.043 [2024-11-27 12:03:25.834584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (39) qid:0 cid:5 nsid:39393939 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.043 [2024-11-27 12:03:25.834601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.043 [2024-11-27 12:03:25.834662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.834676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.043 NEW_FUNC[1/1]: 0x194dc38 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:57.043 #32 NEW cov: 12446 ft: 14729 corp: 17/2041b lim: 320 exec/s: 32 rss: 73Mb L: 217/233 MS: 1 InsertRepeatedBytes- 00:07:57.043 [2024-11-27 12:03:25.894441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.043 [2024-11-27 12:03:25.894467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.302 #33 NEW cov: 12446 ft: 14784 corp: 18/2139b lim: 320 exec/s: 33 rss: 73Mb L: 98/233 MS: 1 EraseBytes- 00:07:57.302 [2024-11-27 12:03:25.954589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.302 [2024-11-27 12:03:25.954619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.302 #34 NEW cov: 12446 ft: 14792 corp: 19/2218b lim: 320 exec/s: 34 rss: 73Mb L: 79/233 MS: 1 ShuffleBytes- 00:07:57.302 [2024-11-27 12:03:25.994703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.302 [2024-11-27 12:03:25.994728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.302 #35 NEW cov: 12446 ft: 14803 corp: 20/2335b lim: 320 exec/s: 35 rss: 73Mb L: 117/233 MS: 1 EraseBytes- 00:07:57.302 [2024-11-27 12:03:26.054882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.302 
[2024-11-27 12:03:26.054908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.302 #36 NEW cov: 12446 ft: 14856 corp: 21/2455b lim: 320 exec/s: 36 rss: 73Mb L: 120/233 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:57.302 [2024-11-27 12:03:26.115257] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.302 [2024-11-27 12:03:26.115284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.302 [2024-11-27 12:03:26.115344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:989898ff cdw10:98989898 cdw11:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9898989898989898 00:07:57.302 [2024-11-27 12:03:26.115358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.302 [2024-11-27 12:03:26.115425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:98989898 cdw11:98989898 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.302 [2024-11-27 12:03:26.115438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.302 #37 NEW cov: 12446 ft: 14877 corp: 22/2689b lim: 320 exec/s: 37 rss: 73Mb L: 234/234 MS: 1 InsertByte- 00:07:57.302 [2024-11-27 12:03:26.155128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:4 nsid:98989898 cdw10:98989898 cdw11:98989898 00:07:57.302 [2024-11-27 12:03:26.155153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.560 #38 NEW cov: 12446 ft: 14936 corp: 23/2802b lim: 320 exec/s: 38 rss: 73Mb L: 113/234 MS: 1 ChangeByte- 00:07:57.560 [2024-11-27 12:03:26.215329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.561 [2024-11-27 12:03:26.215355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 #39 NEW cov: 12446 ft: 14947 corp: 24/2901b lim: 320 exec/s: 39 rss: 73Mb L: 99/234 MS: 1 InsertByte- 00:07:57.561 [2024-11-27 12:03:26.275625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.561 [2024-11-27 12:03:26.275652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 [2024-11-27 12:03:26.275707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:5 nsid:98989898 cdw10:9898ff98 cdw11:ff989898 00:07:57.561 [2024-11-27 12:03:26.275720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.561 #40 NEW cov: 12446 ft: 14963 corp: 25/3071b lim: 320 exec/s: 40 rss: 73Mb L: 170/234 MS: 1 CrossOver- 00:07:57.561 [2024-11-27 12:03:26.335769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.561 [2024-11-27 12:03:26.335795] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 [2024-11-27 12:03:26.335854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:57.561 [2024-11-27 12:03:26.335867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.561 #41 NEW cov: 12446 ft: 14979 corp: 26/3219b lim: 320 exec/s: 41 rss: 74Mb L: 148/234 MS: 1 ShuffleBytes- 00:07:57.561 [2024-11-27 12:03:26.395929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.561 [2024-11-27 12:03:26.395955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.561 [2024-11-27 12:03:26.396010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:5 nsid:98989898 cdw10:9898ff98 cdw11:ff989898 00:07:57.561 [2024-11-27 12:03:26.396024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.561 #47 NEW cov: 12446 ft: 15008 corp: 27/3389b lim: 320 exec/s: 47 rss: 74Mb L: 170/234 MS: 1 ChangeByte- 00:07:57.820 [2024-11-27 12:03:26.456011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.820 [2024-11-27 12:03:26.456036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.820 #48 NEW cov: 12446 ft: 15045 corp: 28/3468b lim: 320 exec/s: 48 rss: 74Mb L: 79/234 MS: 1 ShuffleBytes- 00:07:57.820 [2024-11-27 12:03:26.516210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8a) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.820 [2024-11-27 12:03:26.516236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.820 #49 NEW cov: 12446 ft: 15066 corp: 29/3547b lim: 320 exec/s: 49 rss: 74Mb L: 79/234 MS: 1 ChangeBit- 00:07:57.820 [2024-11-27 12:03:26.576493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.820 [2024-11-27 12:03:26.576519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.820 [2024-11-27 12:03:26.576579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:81818181 cdw10:81818181 cdw11:81818181 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.820 [2024-11-27 12:03:26.576593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.820 #50 NEW cov: 12446 ft: 15074 corp: 30/3732b lim: 320 exec/s: 50 rss: 74Mb L: 185/234 MS: 1 InsertRepeatedBytes- 00:07:57.820 [2024-11-27 12:03:26.636629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.820 [2024-11-27 12:03:26.636656] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.820 [2024-11-27 12:03:26.636716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:57.820 [2024-11-27 12:03:26.636730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.820 #51 NEW cov: 12446 ft: 15088 corp: 31/3880b lim: 320 exec/s: 51 rss: 74Mb L: 148/234 MS: 1 ChangeBit- 00:07:57.820 [2024-11-27 12:03:26.676770] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:57.820 [2024-11-27 12:03:26.676796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.820 [2024-11-27 12:03:26.676854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:5 nsid:98989898 cdw10:9898ff98 cdw11:ff989898 00:07:57.820 [2024-11-27 12:03:26.676868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.820 #52 NEW cov: 12446 ft: 15106 corp: 32/4050b lim: 320 exec/s: 52 rss: 74Mb L: 170/234 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:07:58.079 [2024-11-27 12:03:26.716909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.079 [2024-11-27 12:03:26.716934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.079 [2024-11-27 12:03:26.716992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:5 nsid:98989898 cdw10:98ff9898 cdw11:98989898 00:07:58.079 [2024-11-27 12:03:26.717006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.079 #53 NEW cov: 12446 ft: 15198 corp: 33/4221b lim: 320 exec/s: 53 rss: 74Mb L: 171/234 MS: 1 InsertByte- 00:07:58.079 [2024-11-27 12:03:26.756971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.079 [2024-11-27 12:03:26.756996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.079 [2024-11-27 12:03:26.757057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.079 [2024-11-27 12:03:26.757074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.079 #54 NEW cov: 12446 ft: 15237 corp: 34/4409b lim: 320 exec/s: 54 rss: 74Mb L: 188/234 MS: 1 InsertRepeatedBytes- 00:07:58.079 [2024-11-27 12:03:26.797137] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:58.079 [2024-11-27 12:03:26.797162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.079 [2024-11-27 
12:03:26.797218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (98) qid:0 cid:5 nsid:98989898 cdw10:ffffffff cdw11:ffffffff 00:07:58.079 [2024-11-27 12:03:26.797231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.079 #55 NEW cov: 12446 ft: 15289 corp: 35/4580b lim: 320 exec/s: 27 rss: 74Mb L: 171/234 MS: 1 CopyPart- 00:07:58.079 #55 DONE cov: 12446 ft: 15289 corp: 35/4580b lim: 320 exec/s: 27 rss: 74Mb 00:07:58.079 ###### Recommended dictionary. ###### 00:07:58.079 "\377\377\377\377" # Uses: 0 00:07:58.079 "\017\000\000\000\000\000\000\000" # Uses: 0 00:07:58.079 ###### End of recommended dictionary. ###### 00:07:58.079 Done 55 runs in 2 second(s) 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.079 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:58.338 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:58.338 12:03:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:58.338 [2024-11-27 12:03:26.991861] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
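Before each of the 25 short runs, nvmf/run.sh derives a per-fuzzer target from the fuzzer index: index N listens on TCP port 44NN, the stock fuzz_json.conf (which uses trsvcid 4420) is rewritten accordingly, known shutdown-path leaks are suppressed for LeakSanitizer, and llvm_nvme_fuzz is started against that transport ID with a one-second budget and its own corpus directory. A condensed sketch of that setup, with the long workspace paths shortened, the -P output path omitted, and the LSAN options simplified:

fuzzer_type=1                                  # second of the fuzz_num=25 admin-command targets
port=44$(printf %02d "$fuzzer_type")           # -> 4401
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
suppress_file=/var/tmp/suppress_nvmf_fuzz

mkdir -p "corpus/llvm_nvmf_${fuzzer_type}"
# point the JSON config's NVMe/TCP listener at this fuzzer's port
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "$nvmf_cfg"
# intentional shutdown-path leaks are ignored by LeakSanitizer
echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
echo leak:nvmf_ctrlr_create          >> "$suppress_file"

LSAN_OPTIONS=suppressions=$suppress_file ./llvm_nvme_fuzz -m 0x1 -s 512 \
    -F "$trid" -c "$nvmf_cfg" -t 1 -D "corpus/llvm_nvmf_${fuzzer_type}" -Z "$fuzzer_type"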
00:07:58.338 [2024-11-27 12:03:26.991931] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1722753 ] 00:07:58.338 [2024-11-27 12:03:27.171362] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.338 [2024-11-27 12:03:27.193266] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.597 [2024-11-27 12:03:27.245835] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.597 [2024-11-27 12:03:27.262189] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:58.597 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.597 INFO: Seed: 1026375788 00:07:58.597 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:58.597 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:58.597 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:58.597 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.597 #2 INITED exec/s: 0 rss: 65Mb 00:07:58.597 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:58.597 This may also happen if the target rejected all inputs we tried so far 00:07:58.597 [2024-11-27 12:03:27.311135] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:58.597 [2024-11-27 12:03:27.311263] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:58.597 [2024-11-27 12:03:27.311381] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:58.597 [2024-11-27 12:03:27.311495] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:58.597 [2024-11-27 12:03:27.311730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.597 [2024-11-27 12:03:27.311761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.597 [2024-11-27 12:03:27.311819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.597 [2024-11-27 12:03:27.311833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.597 [2024-11-27 12:03:27.311889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.597 [2024-11-27 12:03:27.311902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.597 [2024-11-27 12:03:27.311955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.597 [2024-11-27 12:03:27.311969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.856 NEW_FUNC[1/715]: 0x459f48 in fuzz_admin_get_log_page_command 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:58.856 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.856 #3 NEW cov: 12234 ft: 12234 corp: 2/27b lim: 30 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:58.856 [2024-11-27 12:03:27.632050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.632084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.856 [2024-11-27 12:03:27.632134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.632148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.856 #4 NEW cov: 12380 ft: 13481 corp: 3/42b lim: 30 exec/s: 0 rss: 72Mb L: 15/26 MS: 1 InsertRepeatedBytes- 00:07:58.856 [2024-11-27 12:03:27.671906] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:58.856 [2024-11-27 12:03:27.672219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.672246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.856 [2024-11-27 12:03:27.672297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.672311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.856 [2024-11-27 12:03:27.672361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.672375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.856 #5 NEW cov: 12392 ft: 13896 corp: 4/61b lim: 30 exec/s: 0 rss: 72Mb L: 19/26 MS: 1 InsertRepeatedBytes- 00:07:58.856 [2024-11-27 12:03:27.731954] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000510a 00:07:58.856 [2024-11-27 12:03:27.732160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.856 [2024-11-27 12:03:27.732186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.116 #10 NEW cov: 12477 ft: 14430 corp: 5/67b lim: 30 exec/s: 0 rss: 72Mb L: 6/26 MS: 5 CrossOver-ShuffleBytes-CrossOver-CopyPart-InsertByte- 00:07:59.116 [2024-11-27 12:03:27.772214] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.116 [2024-11-27 12:03:27.772607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.772633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.116 [2024-11-27 12:03:27.772685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.772699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.116 [2024-11-27 12:03:27.772747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.772760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.116 [2024-11-27 12:03:27.772809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.772822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.116 #11 NEW cov: 12477 ft: 14564 corp: 6/96b lim: 30 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 CopyPart- 00:07:59.116 [2024-11-27 12:03:27.832307] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (172) > len (4) 00:07:59.116 [2024-11-27 12:03:27.832514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.832540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.116 [2024-11-27 12:03:27.832591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.832609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.116 #12 NEW cov: 12477 ft: 14650 corp: 7/112b lim: 30 exec/s: 0 rss: 72Mb L: 16/29 MS: 1 EraseBytes- 00:07:59.116 [2024-11-27 12:03:27.892349] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa0a 00:07:59.116 [2024-11-27 12:03:27.892558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.892603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.116 #13 NEW cov: 12477 ft: 14705 corp: 8/120b lim: 30 exec/s: 0 rss: 72Mb L: 8/29 MS: 1 CMP- DE: "\000\000"- 00:07:59.116 [2024-11-27 12:03:27.952651] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:59.116 [2024-11-27 12:03:27.952860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.952886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.116 [2024-11-27 12:03:27.952939] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.116 [2024-11-27 12:03:27.952952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.116 #14 NEW cov: 12485 ft: 14753 corp: 9/136b lim: 30 exec/s: 0 rss: 72Mb L: 16/29 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.375 [2024-11-27 12:03:28.012821] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.375 [2024-11-27 12:03:28.013115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.013140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.375 [2024-11-27 12:03:28.013192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.013205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.375 [2024-11-27 12:03:28.013254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.013268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.375 #15 NEW cov: 12485 ft: 14788 corp: 10/155b lim: 30 exec/s: 0 rss: 72Mb L: 19/29 MS: 1 ChangeBit- 00:07:59.375 [2024-11-27 12:03:28.052796] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:59.375 [2024-11-27 12:03:28.053105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000083fe cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.053132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.375 [2024-11-27 12:03:28.053182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.053197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.375 #16 NEW cov: 12485 ft: 14857 corp: 11/170b lim: 30 exec/s: 0 rss: 73Mb L: 15/29 MS: 1 CMP- DE: "\376\377\377\377"- 00:07:59.375 [2024-11-27 12:03:28.092968] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x8b8b 00:07:59.375 [2024-11-27 12:03:28.093082] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (929328) > buf size (4096) 00:07:59.375 [2024-11-27 12:03:28.093205] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.375 [2024-11-27 12:03:28.093507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.093533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:59.375 [2024-11-27 12:03:28.093589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b838b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.093607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.375 [2024-11-27 12:03:28.093656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.375 [2024-11-27 12:03:28.093669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.093720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.093734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.376 #17 NEW cov: 12485 ft: 14910 corp: 12/195b lim: 30 exec/s: 0 rss: 73Mb L: 25/29 MS: 1 InsertRepeatedBytes- 00:07:59.376 [2024-11-27 12:03:28.153116] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:59.376 [2024-11-27 12:03:28.153232] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44204) > len (4) 00:07:59.376 [2024-11-27 12:03:28.153338] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (176820) > buf size (4096) 00:07:59.376 [2024-11-27 12:03:28.153551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.153577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.153629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.153645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.153697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:acac0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.153710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.376 #18 NEW cov: 12485 ft: 14941 corp: 13/214b lim: 30 exec/s: 0 rss: 73Mb L: 19/29 MS: 1 CrossOver- 00:07:59.376 [2024-11-27 12:03:28.193238] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:07:59.376 [2024-11-27 12:03:28.193367] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787456) > buf size (4096) 00:07:59.376 [2024-11-27 12:03:28.193674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000083fe cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.193701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.193755] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.193770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.193820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.193838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.376 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:59.376 #19 NEW cov: 12508 ft: 14999 corp: 14/233b lim: 30 exec/s: 0 rss: 73Mb L: 19/29 MS: 1 CopyPart- 00:07:59.376 [2024-11-27 12:03:28.253509] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.376 [2024-11-27 12:03:28.253827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:002000ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.253854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.253905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.253920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.376 [2024-11-27 12:03:28.253969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.376 [2024-11-27 12:03:28.253983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.635 #20 NEW cov: 12508 ft: 15015 corp: 15/252b lim: 30 exec/s: 0 rss: 73Mb L: 19/29 MS: 1 ShuffleBytes- 00:07:59.635 [2024-11-27 12:03:28.293670] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.635 [2024-11-27 12:03:28.294076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.294103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.294156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.294171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.294220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.294234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 
12:03:28.294285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.294298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.635 #21 NEW cov: 12508 ft: 15039 corp: 16/281b lim: 30 exec/s: 21 rss: 73Mb L: 29/29 MS: 1 ChangeByte- 00:07:59.635 [2024-11-27 12:03:28.333654] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.333784] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.333891] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (176820) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.334097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.334124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.334181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.334195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.334246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:acac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.334259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.635 #22 NEW cov: 12508 ft: 15090 corp: 17/301b lim: 30 exec/s: 22 rss: 73Mb L: 20/29 MS: 1 InsertByte- 00:07:59.635 [2024-11-27 12:03:28.394080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.394106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.394156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.394170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.635 #23 NEW cov: 12508 ft: 15162 corp: 18/316b lim: 30 exec/s: 23 rss: 73Mb L: 15/29 MS: 1 ShuffleBytes- 00:07:59.635 [2024-11-27 12:03:28.433927] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.434044] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44204) > len (4) 00:07:59.635 [2024-11-27 12:03:28.434163] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (176820) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.434360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:59.635 [2024-11-27 12:03:28.434387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.434441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.434456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.434509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:acac0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.434522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.635 #24 NEW cov: 12508 ft: 15176 corp: 19/336b lim: 30 exec/s: 24 rss: 73Mb L: 20/29 MS: 1 InsertByte- 00:07:59.635 [2024-11-27 12:03:28.474129] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787452) > buf size (4096) 00:07:59.635 [2024-11-27 12:03:28.474338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.474364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.635 [2024-11-27 12:03:28.474417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00fe83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.635 [2024-11-27 12:03:28.474431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.635 #25 NEW cov: 12508 ft: 15180 corp: 20/351b lim: 30 exec/s: 25 rss: 73Mb L: 15/29 MS: 1 PersAutoDict- DE: "\376\377\377\377"- 00:07:59.896 [2024-11-27 12:03:28.534257] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787452) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.534378] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44032) > len (692) 00:07:59.896 [2024-11-27 12:03:28.534778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00fe83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.534804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.534855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.534869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.534921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.534934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.534983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.534996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.896 #26 NEW cov: 12508 ft: 15198 corp: 21/380b lim: 30 exec/s: 26 rss: 73Mb L: 29/29 MS: 1 PersAutoDict- DE: "\376\377\377\377"- 00:07:59.896 [2024-11-27 12:03:28.594304] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000510a 00:07:59.896 [2024-11-27 12:03:28.594517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.594550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 #27 NEW cov: 12508 ft: 15213 corp: 22/386b lim: 30 exec/s: 27 rss: 73Mb L: 6/29 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:59.896 [2024-11-27 12:03:28.634471] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.634585] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:07:59.896 [2024-11-27 12:03:28.634818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.634844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.634897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.634911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.896 #28 NEW cov: 12508 ft: 15219 corp: 23/398b lim: 30 exec/s: 28 rss: 73Mb L: 12/29 MS: 1 EraseBytes- 00:07:59.896 [2024-11-27 12:03:28.674614] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.674727] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44716) > len (4) 00:07:59.896 [2024-11-27 12:03:28.674833] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (176820) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.675029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.675055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.675107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.675125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.675176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:acac0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.675189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.896 #29 NEW cov: 12508 ft: 15236 corp: 24/417b lim: 30 exec/s: 29 rss: 73Mb L: 19/29 MS: 1 ChangeBit- 00:07:59.896 [2024-11-27 12:03:28.714779] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:59.896 [2024-11-27 12:03:28.714896] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (44204) > len (4) 00:07:59.896 [2024-11-27 12:03:28.715205] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:07:59.896 [2024-11-27 12:03:28.715410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00fe83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.715437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.715489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000000ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.715504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.715554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.715568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.715619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.715633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.715683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:002700ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.715697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.896 #30 NEW cov: 12508 ft: 15312 corp: 25/447b lim: 30 exec/s: 30 rss: 73Mb L: 30/30 MS: 1 InsertByte- 00:07:59.896 [2024-11-27 12:03:28.774933] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x8b8b 00:07:59.896 [2024-11-27 12:03:28.775052] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (929328) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.775159] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (211636) > buf size (4096) 00:07:59.896 [2024-11-27 12:03:28.775446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.775472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.775523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b838b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.775538] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.775588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ceac00ac cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.775607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.896 [2024-11-27 12:03:28.775661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.896 [2024-11-27 12:03:28.775675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.155 #31 NEW cov: 12508 ft: 15328 corp: 26/472b lim: 30 exec/s: 31 rss: 73Mb L: 25/30 MS: 1 ChangeByte- 00:08:00.155 [2024-11-27 12:03:28.834979] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a0a 00:08:00.155 [2024-11-27 12:03:28.835203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.835229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 [2024-11-27 12:03:28.875092] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a0a 00:08:00.155 [2024-11-27 12:03:28.875296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.875321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 #33 NEW cov: 12508 ft: 15338 corp: 27/478b lim: 30 exec/s: 33 rss: 73Mb L: 6/30 MS: 2 CopyPart-ShuffleBytes- 00:08:00.155 [2024-11-27 12:03:28.915253] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa0a 00:08:00.155 [2024-11-27 12:03:28.915462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.915488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 #34 NEW cov: 12508 ft: 15355 corp: 28/485b lim: 30 exec/s: 34 rss: 73Mb L: 7/30 MS: 1 EraseBytes- 00:08:00.155 [2024-11-27 12:03:28.955330] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10248) > buf size (4096) 00:08:00.155 [2024-11-27 12:03:28.955553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.955579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 #35 NEW cov: 12508 ft: 15369 corp: 29/494b lim: 30 exec/s: 35 rss: 73Mb L: 9/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:00.155 [2024-11-27 12:03:28.995490] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:08:00.155 [2024-11-27 12:03:28.995811] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000083fe cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.995838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 [2024-11-27 12:03:28.995889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:28.995904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.155 #36 NEW cov: 12508 ft: 15375 corp: 30/509b lim: 30 exec/s: 36 rss: 73Mb L: 15/30 MS: 1 ChangeBinInt- 00:08:00.155 [2024-11-27 12:03:29.035698] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.155 [2024-11-27 12:03:29.035816] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.155 [2024-11-27 12:03:29.035925] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:08:00.155 [2024-11-27 12:03:29.036029] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:08:00.155 [2024-11-27 12:03:29.036239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:29.036264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.155 [2024-11-27 12:03:29.036316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:29.036330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.155 [2024-11-27 12:03:29.036379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff000c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:29.036393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.155 [2024-11-27 12:03:29.036442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.155 [2024-11-27 12:03:29.036456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.415 #42 NEW cov: 12508 ft: 15427 corp: 31/535b lim: 30 exec/s: 42 rss: 74Mb L: 26/30 MS: 1 CMP- DE: "\014\000\000\000\000\000\000\000"- 00:08:00.415 [2024-11-27 12:03:29.095859] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (172) > len (4) 00:08:00.415 [2024-11-27 12:03:29.096085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.096111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.415 [2024-11-27 12:03:29.096165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.096179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.415 #43 NEW cov: 12508 ft: 15428 corp: 32/551b lim: 30 exec/s: 43 rss: 74Mb L: 16/30 MS: 1 ChangeBinInt- 00:08:00.415 [2024-11-27 12:03:29.155972] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.415 [2024-11-27 12:03:29.156085] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:00.415 [2024-11-27 12:03:29.156193] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786432) > buf size (4096) 00:08:00.415 [2024-11-27 12:03:29.156297] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:08:00.415 [2024-11-27 12:03:29.156501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.156528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.415 [2024-11-27 12:03:29.156578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.156592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.415 [2024-11-27 12:03:29.156652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff020c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.156666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.415 [2024-11-27 12:03:29.156717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.156736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.415 #44 NEW cov: 12508 ft: 15454 corp: 33/577b lim: 30 exec/s: 44 rss: 74Mb L: 26/30 MS: 1 ChangeBinInt- 00:08:00.415 [2024-11-27 12:03:29.216055] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:08:00.415 [2024-11-27 12:03:29.216284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.216310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.415 #45 NEW cov: 12508 ft: 15485 corp: 34/587b lim: 30 exec/s: 45 rss: 74Mb L: 10/30 MS: 1 CrossOver- 00:08:00.415 [2024-11-27 12:03:29.256139] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000510a 00:08:00.415 [2024-11-27 12:03:29.256353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.415 [2024-11-27 12:03:29.256377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:00.415 #46 NEW cov: 12508 ft: 15594 corp: 35/594b lim: 30 exec/s: 23 rss: 74Mb L: 7/30 MS: 1 InsertByte- 00:08:00.415 #46 DONE cov: 12508 ft: 15594 corp: 35/594b lim: 30 exec/s: 23 rss: 74Mb 00:08:00.415 ###### Recommended dictionary. ###### 00:08:00.415 "\000\000" # Uses: 1 00:08:00.415 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:00.415 "\376\377\377\377" # Uses: 2 00:08:00.415 "\014\000\000\000\000\000\000\000" # Uses: 0 00:08:00.415 ###### End of recommended dictionary. ###### 00:08:00.415 Done 46 runs in 2 second(s) 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.673 12:03:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:08:00.673 [2024-11-27 12:03:29.457761] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:00.673 [2024-11-27 12:03:29.457836] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1723120 ] 00:08:00.932 [2024-11-27 12:03:29.632374] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.932 [2024-11-27 12:03:29.654082] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.932 [2024-11-27 12:03:29.706360] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.932 [2024-11-27 12:03:29.722749] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:00.932 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.932 INFO: Seed: 3488391837 00:08:00.932 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:00.932 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:00.932 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:00.932 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.932 #2 INITED exec/s: 0 rss: 66Mb 00:08:00.932 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.932 This may also happen if the target rejected all inputs we tried so far 00:08:00.932 [2024-11-27 12:03:29.789935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.932 [2024-11-27 12:03:29.789971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.932 [2024-11-27 12:03:29.790096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.932 [2024-11-27 12:03:29.790118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.932 [2024-11-27 12:03:29.790243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.932 [2024-11-27 12:03:29.790261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.932 [2024-11-27 12:03:29.790387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.932 [2024-11-27 12:03:29.790406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.932 [2024-11-27 12:03:29.790529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.932 [2024-11-27 12:03:29.790548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.451 NEW_FUNC[1/714]: 0x45c9f8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:01.451 NEW_FUNC[2/714]: 0x495e28 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.451 #4 NEW cov: 12191 ft: 12174 corp: 2/36b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:01.451 [2024-11-27 12:03:30.140958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.141010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.141146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.141171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.141306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.141327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.141459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.141482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.141621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.141645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.451 #5 NEW cov: 12304 ft: 12965 corp: 3/71b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:08:01.451 [2024-11-27 12:03:30.210830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.210862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.210979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22004522 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.210998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.211127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.211145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.211263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.211280] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.211403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.211422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.451 #6 NEW cov: 12310 ft: 13137 corp: 4/106b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ChangeByte- 00:08:01.451 [2024-11-27 12:03:30.249718] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:01.451 [2024-11-27 12:03:30.250257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f0f00000 cdw11:f000f0f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.250295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.250414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f0f000f0 cdw11:0000f0f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.250433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.451 #9 NEW cov: 12406 ft: 14030 corp: 5/123b lim: 35 exec/s: 0 rss: 72Mb L: 17/35 MS: 3 CMP-InsertByte-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:01.451 [2024-11-27 12:03:30.309921] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:01.451 [2024-11-27 12:03:30.310406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f0f00000 cdw11:f000f0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.310442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.451 [2024-11-27 12:03:30.310556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f0f000f0 cdw11:0000f0f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.451 [2024-11-27 12:03:30.310574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.710 #10 NEW cov: 12406 ft: 14238 corp: 6/140b lim: 35 exec/s: 0 rss: 72Mb L: 17/35 MS: 1 ChangeBit- 00:08:01.711 [2024-11-27 12:03:30.380756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.380785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.380905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.380924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.381044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 
[2024-11-27 12:03:30.381061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.711 #11 NEW cov: 12406 ft: 14519 corp: 7/161b lim: 35 exec/s: 0 rss: 73Mb L: 21/35 MS: 1 EraseBytes- 00:08:01.711 [2024-11-27 12:03:30.451542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.451572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.451687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.451705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.451820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.451839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.451951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.451968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.452091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.452110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.711 #12 NEW cov: 12406 ft: 14559 corp: 8/196b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:01.711 [2024-11-27 12:03:30.521735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.521767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.521883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.521906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.522028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:220022c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.522044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.522161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.522177] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.522302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.522320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.711 #13 NEW cov: 12406 ft: 14640 corp: 9/231b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:01.711 [2024-11-27 12:03:30.571827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.571863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.571989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.572008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.572126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:93002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.572144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.572266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.572285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.711 [2024-11-27 12:03:30.572411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.711 [2024-11-27 12:03:30.572430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.970 #14 NEW cov: 12406 ft: 14710 corp: 10/266b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:01.970 [2024-11-27 12:03:30.642188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.642219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.970 [2024-11-27 12:03:30.642336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22004522 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.642355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.970 [2024-11-27 12:03:30.642477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.642495] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.970 [2024-11-27 12:03:30.642625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.642643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.970 [2024-11-27 12:03:30.642771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.642790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.970 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:01.970 #15 NEW cov: 12429 ft: 14776 corp: 11/301b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:01.970 [2024-11-27 12:03:30.692217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:62002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.970 [2024-11-27 12:03:30.692245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.692365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.692383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.692503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.692520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.692645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.692662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.692782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.692797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.971 #16 NEW cov: 12429 ft: 14803 corp: 12/336b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:08:01.971 [2024-11-27 12:03:30.741672] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:01.971 [2024-11-27 12:03:30.742478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.742507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 
12:03:30.742632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.742653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.742780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.742798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.742919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.742941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.743055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.743072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.971 #17 NEW cov: 12429 ft: 14868 corp: 13/371b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:01.971 [2024-11-27 12:03:30.792401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.792430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.792547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.792566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.792692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.792708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.792828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.792847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.971 #18 NEW cov: 12429 ft: 14879 corp: 14/403b lim: 35 exec/s: 18 rss: 73Mb L: 32/35 MS: 1 EraseBytes- 00:08:01.971 [2024-11-27 12:03:30.842615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2222004b cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.842644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.971 
[2024-11-27 12:03:30.842765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.842784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.842904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.842921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.843040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.843058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.971 [2024-11-27 12:03:30.843174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.971 [2024-11-27 12:03:30.843190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.230 #19 NEW cov: 12429 ft: 14916 corp: 15/438b lim: 35 exec/s: 19 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:02.230 [2024-11-27 12:03:30.892804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.892836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.892953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.892971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.893090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.893108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.893227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.893246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.893373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.893390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.230 #20 NEW cov: 12429 ft: 14951 corp: 16/473b lim: 35 exec/s: 20 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:02.230 [2024-11-27 12:03:30.932939] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:dd0022de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.932967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.933083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dddd00dd cdw11:2200dddb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.933100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.933211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.933227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.933353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.933372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.230 [2024-11-27 12:03:30.933492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:30.933511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.230 #21 NEW cov: 12429 ft: 14996 corp: 17/508b lim: 35 exec/s: 21 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:02.230 [2024-11-27 12:03:31.002204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4b2200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:31.002233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.230 #25 NEW cov: 12429 ft: 15315 corp: 18/518b lim: 35 exec/s: 25 rss: 73Mb L: 10/35 MS: 4 CrossOver-ShuffleBytes-ShuffleBytes-CrossOver- 00:08:02.230 [2024-11-27 12:03:31.073057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fc220098 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.230 [2024-11-27 12:03:31.073091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.073221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.073238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.073362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.073379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 
12:03:31.073495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.073513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.231 #27 NEW cov: 12429 ft: 15383 corp: 19/552b lim: 35 exec/s: 27 rss: 73Mb L: 34/35 MS: 2 ChangeByte-CrossOver- 00:08:02.231 [2024-11-27 12:03:31.112817] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:02.231 [2024-11-27 12:03:31.113471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.113500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.113621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.113639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.113759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.113779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.113891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.113910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.231 [2024-11-27 12:03:31.114033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.231 [2024-11-27 12:03:31.114051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.490 #28 NEW cov: 12429 ft: 15412 corp: 20/587b lim: 35 exec/s: 28 rss: 73Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:02.490 [2024-11-27 12:03:31.162688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4b2200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.162717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.490 #29 NEW cov: 12429 ft: 15434 corp: 21/598b lim: 35 exec/s: 29 rss: 73Mb L: 11/35 MS: 1 InsertByte- 00:08:02.490 [2024-11-27 12:03:31.233839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.233869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.233982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 
nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.234004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.234130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.234147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.234273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.234292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.234419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.234438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.490 #30 NEW cov: 12429 ft: 15446 corp: 22/633b lim: 35 exec/s: 30 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:02.490 [2024-11-27 12:03:31.283485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.283516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.283645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.283661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.283784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.283804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.490 #31 NEW cov: 12429 ft: 15481 corp: 23/656b lim: 35 exec/s: 31 rss: 73Mb L: 23/35 MS: 1 EraseBytes- 00:08:02.490 [2024-11-27 12:03:31.354232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.354263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.354381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.354401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.354519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 
cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.354536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.354671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.354691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.490 [2024-11-27 12:03:31.354825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:2200222a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.490 [2024-11-27 12:03:31.354846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.749 #32 NEW cov: 12429 ft: 15504 corp: 24/691b lim: 35 exec/s: 32 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:08:02.749 [2024-11-27 12:03:31.404108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.404139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.749 [2024-11-27 12:03:31.404257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.404275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.749 [2024-11-27 12:03:31.404392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:fc002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.404410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.749 [2024-11-27 12:03:31.404534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.404554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.749 #33 NEW cov: 12429 ft: 15524 corp: 25/720b lim: 35 exec/s: 33 rss: 73Mb L: 29/35 MS: 1 CrossOver- 00:08:02.749 [2024-11-27 12:03:31.474539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.474569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.749 [2024-11-27 12:03:31.474696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22004522 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.474715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.749 [2024-11-27 12:03:31.474833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:45002222 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:02.749 [2024-11-27 12:03:31.474853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.750 [2024-11-27 12:03:31.474977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.474995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.750 [2024-11-27 12:03:31.475118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.475137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.750 #34 NEW cov: 12429 ft: 15548 corp: 26/755b lim: 35 exec/s: 34 rss: 74Mb L: 35/35 MS: 1 CopyPart- 00:08:02.750 [2024-11-27 12:03:31.544487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:e4002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.544517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.750 [2024-11-27 12:03:31.544643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.544665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.750 [2024-11-27 12:03:31.544791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:fc002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.544809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.750 [2024-11-27 12:03:31.544930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.544947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.750 #35 NEW cov: 12429 ft: 15580 corp: 27/784b lim: 35 exec/s: 35 rss: 74Mb L: 29/35 MS: 1 ChangeBinInt- 00:08:02.750 [2024-11-27 12:03:31.614050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.750 [2024-11-27 12:03:31.614081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.009 #36 NEW cov: 12429 ft: 15587 corp: 28/793b lim: 35 exec/s: 36 rss: 74Mb L: 9/35 MS: 1 CrossOver- 00:08:03.009 [2024-11-27 12:03:31.664475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.664504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.009 [2024-11-27 12:03:31.664653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.664672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.009 #37 NEW cov: 12429 ft: 15627 corp: 29/812b lim: 35 exec/s: 37 rss: 74Mb L: 19/35 MS: 1 EraseBytes- 00:08:03.009 [2024-11-27 12:03:31.734915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:222200fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.734947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.009 [2024-11-27 12:03:31.735067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.735085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.009 [2024-11-27 12:03:31.735199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:22220022 cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.735216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.009 #38 NEW cov: 12429 ft: 15651 corp: 30/835b lim: 35 exec/s: 38 rss: 74Mb L: 23/35 MS: 1 EraseBytes- 00:08:03.009 [2024-11-27 12:03:31.784542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5d600fc cdw11:22002222 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.009 [2024-11-27 12:03:31.784569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.009 #39 NEW cov: 12429 ft: 15661 corp: 31/845b lim: 35 exec/s: 19 rss: 74Mb L: 10/35 MS: 1 ChangeBinInt- 00:08:03.009 #39 DONE cov: 12429 ft: 15661 corp: 31/845b lim: 35 exec/s: 19 rss: 74Mb 00:08:03.009 ###### Recommended dictionary. ###### 00:08:03.009 "\000\000\000\000" # Uses: 2 00:08:03.009 ###### End of recommended dictionary. 
###### 00:08:03.009 Done 39 runs in 2 second(s) 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.268 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.269 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.269 12:03:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:08:03.269 [2024-11-27 12:03:31.968144] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:03.269 [2024-11-27 12:03:31.968214] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1723651 ] 00:08:03.269 [2024-11-27 12:03:32.144726] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.528 [2024-11-27 12:03:32.166827] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.528 [2024-11-27 12:03:32.219071] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.528 [2024-11-27 12:03:32.235371] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:03.528 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:03.528 INFO: Seed: 1706411978 00:08:03.528 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:03.528 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:03.528 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:03.528 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.528 #2 INITED exec/s: 0 rss: 65Mb 00:08:03.528 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.528 This may also happen if the target rejected all inputs we tried so far 00:08:03.786 NEW_FUNC[1/707]: 0x45e6d8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:03.786 NEW_FUNC[2/707]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.786 #9 NEW cov: 12174 ft: 12173 corp: 2/11b lim: 20 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 CopyPart-CMP- DE: "\001\000\000\000\000\000\000("- 00:08:03.786 #10 NEW cov: 12287 ft: 13037 corp: 3/18b lim: 20 exec/s: 0 rss: 72Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:04.045 #11 NEW cov: 12293 ft: 13212 corp: 4/23b lim: 20 exec/s: 0 rss: 72Mb L: 5/10 MS: 1 EraseBytes- 00:08:04.045 #12 NEW cov: 12404 ft: 13924 corp: 5/41b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:08:04.045 #13 NEW cov: 12404 ft: 14027 corp: 6/49b lim: 20 exec/s: 0 rss: 72Mb L: 8/18 MS: 1 CopyPart- 00:08:04.045 [2024-11-27 12:03:32.822247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.045 [2024-11-27 12:03:32.822284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.045 NEW_FUNC[1/15]: 0x18468c8 in nvme_ctrlr_queue_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3287 00:08:04.045 NEW_FUNC[2/15]: 0x186ba68 in nvme_ctrlr_process_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3247 00:08:04.045 #14 NEW cov: 12622 ft: 14325 corp: 7/57b lim: 20 exec/s: 0 rss: 72Mb L: 8/18 MS: 1 CrossOver- 00:08:04.045 #15 NEW cov: 12622 ft: 14434 corp: 8/67b lim: 20 exec/s: 0 rss: 72Mb L: 10/18 MS: 1 ChangeByte- 00:08:04.304 #16 NEW cov: 12622 ft: 14445 corp: 9/78b lim: 20 exec/s: 0 rss: 72Mb L: 11/18 MS: 1 InsertRepeatedBytes- 00:08:04.304 #17 NEW cov: 12622 ft: 14527 corp: 10/86b lim: 20 exec/s: 0 rss: 72Mb L: 8/18 MS: 1 CrossOver- 00:08:04.304 #20 NEW cov: 12622 ft: 14582 corp: 11/92b lim: 20 exec/s: 0 rss: 72Mb L: 6/18 MS: 3 InsertByte-CopyPart-CMP- DE: "\001\000\000\021"- 00:08:04.304 #21 NEW cov: 12622 ft: 14630 corp: 12/98b lim: 20 exec/s: 0 rss: 73Mb L: 6/18 MS: 1 EraseBytes- 00:08:04.304 #22 NEW cov: 12622 ft: 14653 corp: 13/106b lim: 20 exec/s: 0 rss: 73Mb L: 8/18 MS: 1 CMP- DE: "\377\377\000\000"- 00:08:04.564 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:04.564 #23 NEW cov: 12645 ft: 14703 corp: 14/116b lim: 20 exec/s: 0 rss: 73Mb L: 10/18 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:04.564 #24 NEW cov: 12645 ft: 14712 corp: 15/120b lim: 20 exec/s: 0 rss: 73Mb L: 4/18 MS: 1 EraseBytes- 00:08:04.564 #25 NEW cov: 12645 ft: 14721 corp: 16/131b lim: 20 exec/s: 25 rss: 73Mb L: 11/18 MS: 1 ChangeBit- 00:08:04.564 #26 NEW cov: 12645 ft: 14762 corp: 
17/135b lim: 20 exec/s: 26 rss: 73Mb L: 4/18 MS: 1 CopyPart- 00:08:04.564 #27 NEW cov: 12645 ft: 14829 corp: 18/144b lim: 20 exec/s: 27 rss: 73Mb L: 9/18 MS: 1 InsertByte- 00:08:04.823 #29 NEW cov: 12645 ft: 14844 corp: 19/149b lim: 20 exec/s: 29 rss: 73Mb L: 5/18 MS: 2 CopyPart-CMP- DE: "\377\377\377\007"- 00:08:04.823 #30 NEW cov: 12645 ft: 14862 corp: 20/154b lim: 20 exec/s: 30 rss: 73Mb L: 5/18 MS: 1 InsertByte- 00:08:04.823 #31 NEW cov: 12645 ft: 14892 corp: 21/162b lim: 20 exec/s: 31 rss: 73Mb L: 8/18 MS: 1 ChangeBit- 00:08:04.823 #32 NEW cov: 12645 ft: 14969 corp: 22/172b lim: 20 exec/s: 32 rss: 73Mb L: 10/18 MS: 1 CopyPart- 00:08:04.823 #33 NEW cov: 12649 ft: 15124 corp: 23/184b lim: 20 exec/s: 33 rss: 73Mb L: 12/18 MS: 1 InsertRepeatedBytes- 00:08:04.823 #34 NEW cov: 12649 ft: 15142 corp: 24/194b lim: 20 exec/s: 34 rss: 73Mb L: 10/18 MS: 1 CrossOver- 00:08:05.081 #35 NEW cov: 12649 ft: 15193 corp: 25/202b lim: 20 exec/s: 35 rss: 73Mb L: 8/18 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:05.081 #36 NEW cov: 12649 ft: 15201 corp: 26/209b lim: 20 exec/s: 36 rss: 73Mb L: 7/18 MS: 1 CopyPart- 00:08:05.081 #37 NEW cov: 12649 ft: 15245 corp: 27/217b lim: 20 exec/s: 37 rss: 73Mb L: 8/18 MS: 1 EraseBytes- 00:08:05.081 #38 NEW cov: 12649 ft: 15288 corp: 28/235b lim: 20 exec/s: 38 rss: 73Mb L: 18/18 MS: 1 ChangeBit- 00:08:05.081 #39 NEW cov: 12649 ft: 15294 corp: 29/245b lim: 20 exec/s: 39 rss: 74Mb L: 10/18 MS: 1 ChangeByte- 00:08:05.341 #40 NEW cov: 12649 ft: 15310 corp: 30/259b lim: 20 exec/s: 40 rss: 74Mb L: 14/18 MS: 1 PersAutoDict- DE: "\377\377\000\000"- 00:08:05.341 #41 NEW cov: 12649 ft: 15348 corp: 31/276b lim: 20 exec/s: 41 rss: 74Mb L: 17/18 MS: 1 CopyPart- 00:08:05.341 [2024-11-27 12:03:34.095732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.341 [2024-11-27 12:03:34.095764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.341 #42 NEW cov: 12649 ft: 15356 corp: 32/284b lim: 20 exec/s: 42 rss: 74Mb L: 8/18 MS: 1 ChangeBinInt- 00:08:05.341 #43 NEW cov: 12649 ft: 15368 corp: 33/291b lim: 20 exec/s: 43 rss: 74Mb L: 7/18 MS: 1 ChangeBit- 00:08:05.341 #47 NEW cov: 12649 ft: 15442 corp: 34/309b lim: 20 exec/s: 47 rss: 74Mb L: 18/18 MS: 4 ChangeByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:05.600 #49 NEW cov: 12649 ft: 15443 corp: 35/318b lim: 20 exec/s: 24 rss: 74Mb L: 9/18 MS: 2 CopyPart-CrossOver- 00:08:05.600 #49 DONE cov: 12649 ft: 15443 corp: 35/318b lim: 20 exec/s: 24 rss: 74Mb 00:08:05.600 ###### Recommended dictionary. ###### 00:08:05.600 "\001\000\000\000\000\000\000(" # Uses: 0 00:08:05.600 "\001\000\000\021" # Uses: 0 00:08:05.600 "\377\377\000\000" # Uses: 1 00:08:05.600 "\377\377\377\377" # Uses: 1 00:08:05.600 "\377\377\377\007" # Uses: 0 00:08:05.600 ###### End of recommended dictionary. 
###### 00:08:05.600 Done 49 runs in 2 second(s) 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:05.600 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.601 12:03:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:05.601 [2024-11-27 12:03:34.459814] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:05.601 [2024-11-27 12:03:34.459908] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724000 ] 00:08:05.860 [2024-11-27 12:03:34.637357] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.860 [2024-11-27 12:03:34.659198] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.860 [2024-11-27 12:03:34.711559] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.860 [2024-11-27 12:03:34.727931] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:05.860 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:05.860 INFO: Seed: 4199420457 00:08:06.119 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:06.119 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:06.119 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:06.119 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.119 #2 INITED exec/s: 0 rss: 65Mb 00:08:06.119 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.119 This may also happen if the target rejected all inputs we tried so far 00:08:06.119 [2024-11-27 12:03:34.794182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:a32d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.119 [2024-11-27 12:03:34.794223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.379 NEW_FUNC[1/715]: 0x45f7d8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:06.379 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.379 #6 NEW cov: 12212 ft: 12212 corp: 2/8b lim: 35 exec/s: 0 rss: 71Mb L: 7/7 MS: 4 InsertByte-InsertByte-ShuffleBytes-CMP- DE: " \000\000\000"- 00:08:06.379 [2024-11-27 12:03:35.135306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff03 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.379 [2024-11-27 12:03:35.135348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.379 #11 NEW cov: 12325 ft: 12672 corp: 3/18b lim: 35 exec/s: 0 rss: 71Mb L: 10/10 MS: 5 ChangeByte-CopyPart-ChangeBinInt-ChangeBit-CMP- DE: "\377\003\000\000\000\000\000\000"- 00:08:06.379 [2024-11-27 12:03:35.185407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00d30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.379 [2024-11-27 12:03:35.185437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.379 #12 NEW cov: 12331 ft: 12905 corp: 4/25b lim: 35 exec/s: 0 rss: 71Mb L: 7/10 MS: 1 EraseBytes- 00:08:06.379 [2024-11-27 12:03:35.255637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:a32d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.379 [2024-11-27 12:03:35.255666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.638 #13 NEW cov: 12416 ft: 13204 corp: 5/32b lim: 35 exec/s: 0 rss: 72Mb L: 7/10 MS: 1 CopyPart- 00:08:06.638 [2024-11-27 12:03:35.325819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:ab2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.638 [2024-11-27 12:03:35.325850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.638 #14 NEW cov: 12416 ft: 13327 corp: 6/39b lim: 35 exec/s: 0 rss: 72Mb L: 7/10 MS: 1 ChangeBit- 00:08:06.638 [2024-11-27 12:03:35.375955] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:ab2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.638 [2024-11-27 12:03:35.375985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.638 #15 NEW cov: 12416 ft: 13383 corp: 7/50b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 PersAutoDict- DE: " \000\000\000"- 00:08:06.638 [2024-11-27 12:03:35.446214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:28a30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.638 [2024-11-27 12:03:35.446241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.638 #16 NEW cov: 12416 ft: 13454 corp: 8/58b lim: 35 exec/s: 0 rss: 72Mb L: 8/11 MS: 1 InsertByte- 00:08:06.638 [2024-11-27 12:03:35.516454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20000e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.638 [2024-11-27 12:03:35.516486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.897 #17 NEW cov: 12416 ft: 13479 corp: 9/68b lim: 35 exec/s: 0 rss: 72Mb L: 10/11 MS: 1 CMP- DE: "\016\000"- 00:08:06.897 [2024-11-27 12:03:35.586618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20002000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.897 [2024-11-27 12:03:35.586647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.897 #18 NEW cov: 12416 ft: 13491 corp: 10/81b lim: 35 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 CrossOver- 00:08:06.897 [2024-11-27 12:03:35.636790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00007c00 cdw11:a32d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.897 [2024-11-27 12:03:35.636819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.897 #19 NEW cov: 12416 ft: 13534 corp: 11/88b lim: 35 exec/s: 0 rss: 72Mb L: 7/13 MS: 1 ChangeByte- 00:08:06.897 [2024-11-27 12:03:35.686937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:ab2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.897 [2024-11-27 12:03:35.686965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.897 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:06.897 #20 NEW cov: 12439 ft: 13580 corp: 12/97b lim: 35 exec/s: 0 rss: 72Mb L: 9/13 MS: 1 PersAutoDict- DE: "\016\000"- 00:08:06.897 [2024-11-27 12:03:35.737087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002100 cdw11:a32d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.897 [2024-11-27 12:03:35.737116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.897 #21 NEW cov: 12439 ft: 13616 corp: 13/104b lim: 35 exec/s: 0 rss: 72Mb L: 7/13 MS: 1 ChangeBit- 00:08:07.156 [2024-11-27 12:03:35.787304] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20320e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.156 [2024-11-27 12:03:35.787332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.156 #22 NEW cov: 12439 ft: 13654 corp: 14/114b lim: 35 exec/s: 22 rss: 72Mb L: 10/13 MS: 1 ChangeByte- 00:08:07.156 [2024-11-27 12:03:35.857542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:212d0000 cdw11:a3000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.156 [2024-11-27 12:03:35.857572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.156 #23 NEW cov: 12439 ft: 13663 corp: 15/121b lim: 35 exec/s: 23 rss: 72Mb L: 7/13 MS: 1 ShuffleBytes- 00:08:07.156 [2024-11-27 12:03:35.927734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.156 [2024-11-27 12:03:35.927765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.156 #24 NEW cov: 12439 ft: 13769 corp: 16/130b lim: 35 exec/s: 24 rss: 72Mb L: 9/13 MS: 1 PersAutoDict- DE: "\016\000"- 00:08:07.156 [2024-11-27 12:03:35.998213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20320e00 cdw11:00200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.156 [2024-11-27 12:03:35.998243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.156 [2024-11-27 12:03:35.998379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a32d0000 cdw11:00280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.156 [2024-11-27 12:03:35.998398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.414 #25 NEW cov: 12439 ft: 14543 corp: 17/146b lim: 35 exec/s: 25 rss: 72Mb L: 16/16 MS: 1 CrossOver- 00:08:07.414 [2024-11-27 12:03:36.068126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:200e2020 cdw11:20200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.068158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.414 #30 NEW cov: 12439 ft: 14669 corp: 18/158b lim: 35 exec/s: 30 rss: 73Mb L: 12/16 MS: 5 EraseBytes-ChangeBinInt-CrossOver-CopyPart-CopyPart- 00:08:07.414 [2024-11-27 12:03:36.138961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:28a30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.138991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.414 [2024-11-27 12:03:36.139120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.139138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.414 [2024-11-27 
12:03:36.139268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.139287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.414 #31 NEW cov: 12439 ft: 14895 corp: 19/184b lim: 35 exec/s: 31 rss: 73Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:07.414 [2024-11-27 12:03:36.198850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.198880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.414 [2024-11-27 12:03:36.199015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.199036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.414 #32 NEW cov: 12439 ft: 14941 corp: 20/204b lim: 35 exec/s: 32 rss: 73Mb L: 20/26 MS: 1 InsertRepeatedBytes- 00:08:07.414 [2024-11-27 12:03:36.269429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:a32d0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.269461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.414 [2024-11-27 12:03:36.269604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.269623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.414 [2024-11-27 12:03:36.269758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.414 [2024-11-27 12:03:36.269776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.414 #33 NEW cov: 12439 ft: 15015 corp: 21/225b lim: 35 exec/s: 33 rss: 73Mb L: 21/26 MS: 1 InsertRepeatedBytes- 00:08:07.673 [2024-11-27 12:03:36.318892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:a32d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.318925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.673 #34 NEW cov: 12439 ft: 15029 corp: 22/234b lim: 35 exec/s: 34 rss: 73Mb L: 9/26 MS: 1 CopyPart- 00:08:07.673 [2024-11-27 12:03:36.369130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002031 cdw11:ab2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.369160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.673 #35 NEW cov: 12439 ft: 15044 corp: 23/241b lim: 35 exec/s: 35 rss: 73Mb L: 7/26 MS: 1 ChangeByte- 00:08:07.673 [2024-11-27 
12:03:36.419584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20ff2020 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.419620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.673 [2024-11-27 12:03:36.419767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:200e2020 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.419785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.673 #36 NEW cov: 12439 ft: 15058 corp: 24/256b lim: 35 exec/s: 36 rss: 73Mb L: 15/26 MS: 1 InsertRepeatedBytes- 00:08:07.673 [2024-11-27 12:03:36.489511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000020 cdw11:00a30000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.489540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.673 #37 NEW cov: 12439 ft: 15059 corp: 25/264b lim: 35 exec/s: 37 rss: 73Mb L: 8/26 MS: 1 InsertByte- 00:08:07.673 [2024-11-27 12:03:36.539650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:200e2020 cdw11:20200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.673 [2024-11-27 12:03:36.539681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.931 #38 NEW cov: 12439 ft: 15104 corp: 26/276b lim: 35 exec/s: 38 rss: 73Mb L: 12/26 MS: 1 ChangeBit- 00:08:07.931 [2024-11-27 12:03:36.590651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20000e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.590680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.590805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0affa32d cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.590825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.590945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.590963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.591099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.591116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.931 #39 NEW cov: 12439 ft: 15495 corp: 27/310b lim: 35 exec/s: 39 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:07.931 [2024-11-27 12:03:36.650221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002000 
cdw11:ab2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.650250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.650383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a32d2000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.650399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.931 #40 NEW cov: 12439 ft: 15545 corp: 28/324b lim: 35 exec/s: 40 rss: 73Mb L: 14/34 MS: 1 CrossOver- 00:08:07.931 [2024-11-27 12:03:36.721071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e8e80ae8 cdw11:e8e80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.721100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.721222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e8e8e8e8 cdw11:e8e80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.721239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.931 [2024-11-27 12:03:36.721371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:e8e8e8e8 cdw11:e8e80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.931 [2024-11-27 12:03:36.721388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.932 [2024-11-27 12:03:36.721522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:e8e8e8e8 cdw11:e8e80003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.932 [2024-11-27 12:03:36.721539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.932 #44 NEW cov: 12439 ft: 15566 corp: 29/352b lim: 35 exec/s: 44 rss: 73Mb L: 28/34 MS: 4 CopyPart-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:08:07.932 [2024-11-27 12:03:36.770294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:20002000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.932 [2024-11-27 12:03:36.770323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.932 #45 NEW cov: 12439 ft: 15575 corp: 30/365b lim: 35 exec/s: 22 rss: 73Mb L: 13/34 MS: 1 ChangeByte- 00:08:07.932 #45 DONE cov: 12439 ft: 15575 corp: 30/365b lim: 35 exec/s: 22 rss: 73Mb 00:08:07.932 ###### Recommended dictionary. ###### 00:08:07.932 " \000\000\000" # Uses: 1 00:08:07.932 "\377\003\000\000\000\000\000\000" # Uses: 0 00:08:07.932 "\016\000" # Uses: 2 00:08:07.932 ###### End of recommended dictionary. 
###### 00:08:07.932 Done 45 runs in 2 second(s) 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.190 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.191 12:03:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:08.191 [2024-11-27 12:03:36.972619] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:08.191 [2024-11-27 12:03:36.972689] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724472 ] 00:08:08.448 [2024-11-27 12:03:37.154659] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.448 [2024-11-27 12:03:37.176233] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.448 [2024-11-27 12:03:37.228442] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.448 [2024-11-27 12:03:37.244750] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:08.448 INFO: Running with entropic power schedule (0xFF, 100). 
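The per-fuzzer harness steps logged above follow one pattern: derive a TCP service ID from the fuzzer index (4400 plus the zero-padded index), rewrite trsvcid in the NVMe-oF fuzz config with sed, register LSAN leak suppressions, and launch llvm_nvme_fuzz against a freshly created corpus directory. Below is a minimal standalone sketch of that pattern assembled from the commands visible in this log; the SPDK checkout path is copied from the log, while the output redirections for the sed result and the suppression entries are assumptions about what nvmf/run.sh does, since they are not shown verbatim above.

  #!/usr/bin/env bash
  # Sketch of the launch pattern from nvmf/run.sh as logged above (assumed layout).
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path taken from this log
  i=5                                      # fuzzer_type index
  port=44$(printf %02d "$i")               # -> 4405, 4406, 4407, ...
  cfg=/tmp/fuzz_json_${i}.conf
  corpus=$SPDK/../corpus/llvm_nvmf_${i}
  supp=/var/tmp/suppress_nvmf_fuzz

  mkdir -p "$corpus"
  # Point the target config at this instance's private TCP port (assumed redirection).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$cfg"
  # Assumed: the logged "echo leak:..." lines populate the LSAN suppression file.
  printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > "$supp"

  LSAN_OPTIONS=report_objects=1:suppressions=$supp:print_suppressions=0 \
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$cfg" -t 1 -D "$corpus" -Z "$i"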
00:08:08.448 INFO: Seed: 2420445512 00:08:08.448 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:08.448 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:08.448 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:08.448 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.448 #2 INITED exec/s: 0 rss: 65Mb 00:08:08.448 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.448 This may also happen if the target rejected all inputs we tried so far 00:08:08.448 [2024-11-27 12:03:37.314108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.448 [2024-11-27 12:03:37.314145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.965 NEW_FUNC[1/715]: 0x461978 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:08.965 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.965 #3 NEW cov: 12212 ft: 12215 corp: 2/10b lim: 45 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "@\000\000\000\000\000\000\000"- 00:08:08.965 [2024-11-27 12:03:37.665596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.965 [2024-11-27 12:03:37.665653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.965 [2024-11-27 12:03:37.665791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.965 [2024-11-27 12:03:37.665815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.965 #4 NEW cov: 12336 ft: 13613 corp: 3/29b lim: 45 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:08.965 [2024-11-27 12:03:37.725153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:00040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.965 [2024-11-27 12:03:37.725190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.965 #10 NEW cov: 12342 ft: 13721 corp: 4/38b lim: 45 exec/s: 0 rss: 73Mb L: 9/19 MS: 1 ChangeBit- 00:08:08.965 [2024-11-27 12:03:37.795435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.965 [2024-11-27 12:03:37.795467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.965 #11 NEW cov: 12427 ft: 13936 corp: 5/47b lim: 45 exec/s: 0 rss: 73Mb L: 9/19 MS: 1 ChangeByte- 00:08:08.965 [2024-11-27 12:03:37.845563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000140 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.965 [2024-11-27 12:03:37.845593] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.224 #14 NEW cov: 12427 ft: 14054 corp: 6/56b lim: 45 exec/s: 0 rss: 73Mb L: 9/19 MS: 3 ChangeByte-ChangeBinInt-PersAutoDict- DE: "@\000\000\000\000\000\000\000"- 00:08:09.224 [2024-11-27 12:03:37.895718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffc4ff cdw11:fffb0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:37.895748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.224 #15 NEW cov: 12427 ft: 14091 corp: 7/65b lim: 45 exec/s: 0 rss: 73Mb L: 9/19 MS: 1 ChangeBinInt- 00:08:09.224 [2024-11-27 12:03:37.966438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffc4ff cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:37.966469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:37.966586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:37.966616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:37.966745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:37.966761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.224 #16 NEW cov: 12427 ft: 14382 corp: 8/96b lim: 45 exec/s: 0 rss: 73Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:09.224 [2024-11-27 12:03:38.036331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:542b5454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.036359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:38.036481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.036498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.224 #17 NEW cov: 12427 ft: 14435 corp: 9/115b lim: 45 exec/s: 0 rss: 73Mb L: 19/31 MS: 1 ChangeByte- 00:08:09.224 [2024-11-27 12:03:38.107228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.107257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:38.107378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:14141414 cdw11:14140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.107398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:38.107515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:14141414 cdw11:14140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.107531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.224 [2024-11-27 12:03:38.107649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:14141414 cdw11:14140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.224 [2024-11-27 12:03:38.107666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.482 #18 NEW cov: 12427 ft: 14814 corp: 10/156b lim: 45 exec/s: 0 rss: 73Mb L: 41/41 MS: 1 InsertRepeatedBytes- 00:08:09.482 [2024-11-27 12:03:38.176523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0ac4 cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.482 [2024-11-27 12:03:38.176552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.482 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:09.482 #20 NEW cov: 12450 ft: 14917 corp: 11/172b lim: 45 exec/s: 0 rss: 73Mb L: 16/41 MS: 2 CrossOver-PersAutoDict- DE: "@\000\000\000\000\000\000\000"- 00:08:09.482 [2024-11-27 12:03:38.226613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:10040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.482 [2024-11-27 12:03:38.226641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.482 #21 NEW cov: 12450 ft: 14936 corp: 12/181b lim: 45 exec/s: 0 rss: 73Mb L: 9/41 MS: 1 ChangeBit- 00:08:09.482 [2024-11-27 12:03:38.276795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00004040 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.482 [2024-11-27 12:03:38.276825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.482 #22 NEW cov: 12450 ft: 14994 corp: 13/190b lim: 45 exec/s: 22 rss: 73Mb L: 9/41 MS: 1 PersAutoDict- DE: "@\000\000\000\000\000\000\000"- 00:08:09.482 [2024-11-27 12:03:38.326963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a39 cdw11:00000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.482 [2024-11-27 12:03:38.326992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.482 #23 NEW cov: 12450 ft: 15038 corp: 14/206b lim: 45 exec/s: 23 rss: 73Mb L: 16/41 MS: 1 ChangeBinInt- 00:08:09.740 [2024-11-27 12:03:38.397178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a39 cdw11:00000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.397206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.740 #24 NEW cov: 12450 ft: 15169 corp: 15/222b lim: 45 exec/s: 24 rss: 73Mb L: 16/41 MS: 1 ChangeByte- 00:08:09.740 [2024-11-27 
12:03:38.467624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:542b5454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.467654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.740 [2024-11-27 12:03:38.467766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:542a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.467784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.740 #25 NEW cov: 12450 ft: 15247 corp: 16/241b lim: 45 exec/s: 25 rss: 74Mb L: 19/41 MS: 1 ChangeByte- 00:08:09.740 [2024-11-27 12:03:38.537888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.537918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.740 [2024-11-27 12:03:38.538036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.538067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.740 #26 NEW cov: 12450 ft: 15272 corp: 17/260b lim: 45 exec/s: 26 rss: 74Mb L: 19/41 MS: 1 ShuffleBytes- 00:08:09.740 [2024-11-27 12:03:38.588023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:54543f54 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.588051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.740 [2024-11-27 12:03:38.588165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.740 [2024-11-27 12:03:38.588182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.998 #32 NEW cov: 12450 ft: 15290 corp: 18/280b lim: 45 exec/s: 32 rss: 74Mb L: 20/41 MS: 1 InsertByte- 00:08:09.998 [2024-11-27 12:03:38.658556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffc4ff cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.658585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.658700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.658717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.658831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.658848] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.998 #33 NEW cov: 12450 ft: 15350 corp: 19/311b lim: 45 exec/s: 33 rss: 74Mb L: 31/41 MS: 1 ChangeBit- 00:08:09.998 [2024-11-27 12:03:38.729131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffc4ff cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.729161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.729280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.729296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.729407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.729423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.729529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.729546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.998 #34 NEW cov: 12450 ft: 15399 corp: 20/351b lim: 45 exec/s: 34 rss: 74Mb L: 40/41 MS: 1 InsertRepeatedBytes- 00:08:09.998 [2024-11-27 12:03:38.778968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffc4ff cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.778998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.779110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.779126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.998 [2024-11-27 12:03:38.779237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.998 [2024-11-27 12:03:38.779252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.999 #35 NEW cov: 12450 ft: 15407 corp: 21/386b lim: 45 exec/s: 35 rss: 74Mb L: 35/41 MS: 1 CMP- DE: "\000@\000\000"- 00:08:09.999 [2024-11-27 12:03:38.828778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a39 cdw11:00000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.999 [2024-11-27 12:03:38.828807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.999 [2024-11-27 12:03:38.828924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:bfff0000 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.999 [2024-11-27 12:03:38.828952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.999 #36 NEW cov: 12450 ft: 15444 corp: 22/410b lim: 45 exec/s: 36 rss: 74Mb L: 24/41 MS: 1 CopyPart- 00:08:09.999 [2024-11-27 12:03:38.878967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.999 [2024-11-27 12:03:38.878998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.999 [2024-11-27 12:03:38.879103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.999 [2024-11-27 12:03:38.879119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.257 #37 NEW cov: 12450 ft: 15453 corp: 23/433b lim: 45 exec/s: 37 rss: 74Mb L: 23/41 MS: 1 CrossOver- 00:08:10.257 [2024-11-27 12:03:38.929113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:38.929143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.257 [2024-11-27 12:03:38.929261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:38.929280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.257 #39 NEW cov: 12450 ft: 15605 corp: 24/453b lim: 45 exec/s: 39 rss: 74Mb L: 20/41 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:10.257 [2024-11-27 12:03:38.978927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a39 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:38.978957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.257 #40 NEW cov: 12450 ft: 15619 corp: 25/469b lim: 45 exec/s: 40 rss: 74Mb L: 16/41 MS: 1 ChangeBit- 00:08:10.257 [2024-11-27 12:03:39.029126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:25000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:39.029155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.257 #41 NEW cov: 12450 ft: 15652 corp: 26/478b lim: 45 exec/s: 41 rss: 74Mb L: 9/41 MS: 1 ShuffleBytes- 00:08:10.257 [2024-11-27 12:03:39.079548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:542b5454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:39.079578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.257 [2024-11-27 12:03:39.079722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 
cdw10:54545454 cdw11:54540002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:39.079741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.257 #42 NEW cov: 12450 ft: 15667 corp: 27/497b lim: 45 exec/s: 42 rss: 74Mb L: 19/41 MS: 1 CopyPart- 00:08:10.257 [2024-11-27 12:03:39.130058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:25000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:39.130089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.257 [2024-11-27 12:03:39.130209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:43434343 cdw11:43430002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.257 [2024-11-27 12:03:39.130227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.258 [2024-11-27 12:03:39.130349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:43434343 cdw11:43430002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.258 [2024-11-27 12:03:39.130366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.517 #43 NEW cov: 12450 ft: 15675 corp: 28/530b lim: 45 exec/s: 43 rss: 74Mb L: 33/41 MS: 1 InsertRepeatedBytes- 00:08:10.517 [2024-11-27 12:03:39.199705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:04000010 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.517 [2024-11-27 12:03:39.199738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.517 #44 NEW cov: 12450 ft: 15708 corp: 29/544b lim: 45 exec/s: 44 rss: 74Mb L: 14/41 MS: 1 CopyPart- 00:08:10.517 [2024-11-27 12:03:39.270162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.517 [2024-11-27 12:03:39.270192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.517 [2024-11-27 12:03:39.270306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.517 [2024-11-27 12:03:39.270323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.517 #45 NEW cov: 12450 ft: 15795 corp: 30/564b lim: 45 exec/s: 22 rss: 74Mb L: 20/41 MS: 1 ChangeBinInt- 00:08:10.517 #45 DONE cov: 12450 ft: 15795 corp: 30/564b lim: 45 exec/s: 22 rss: 74Mb 00:08:10.517 ###### Recommended dictionary. ###### 00:08:10.517 "@\000\000\000\000\000\000\000" # Uses: 3 00:08:10.517 "\000@\000\000" # Uses: 0 00:08:10.517 ###### End of recommended dictionary. 
###### 00:08:10.517 Done 45 runs in 2 second(s) 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.777 12:03:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:10.777 [2024-11-27 12:03:39.475980] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:10.777 [2024-11-27 12:03:39.476067] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1725001 ] 00:08:10.777 [2024-11-27 12:03:39.652086] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.037 [2024-11-27 12:03:39.674279] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.037 [2024-11-27 12:03:39.726625] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.037 [2024-11-27 12:03:39.742949] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:11.037 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:11.037 INFO: Seed: 622477103 00:08:11.037 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:11.037 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:11.037 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:11.037 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.037 #2 INITED exec/s: 0 rss: 65Mb 00:08:11.037 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.037 This may also happen if the target rejected all inputs we tried so far 00:08:11.037 [2024-11-27 12:03:39.792215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:11.037 [2024-11-27 12:03:39.792244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.295 NEW_FUNC[1/713]: 0x464188 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:11.295 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.295 #4 NEW cov: 12140 ft: 12126 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ShuffleBytes-InsertByte- 00:08:11.295 [2024-11-27 12:03:40.143907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e80a cdw11:00000000 00:08:11.295 [2024-11-27 12:03:40.143956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.295 #5 NEW cov: 12253 ft: 12947 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:11.554 [2024-11-27 12:03:40.193962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.193992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.554 #6 NEW cov: 12259 ft: 13120 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CrossOver- 00:08:11.554 [2024-11-27 12:03:40.264336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.264366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.554 [2024-11-27 12:03:40.264472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006969 cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.264491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.554 #7 NEW cov: 12344 ft: 13671 corp: 5/11b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:08:11.554 [2024-11-27 12:03:40.324303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.324332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.554 [2024-11-27 12:03:40.324441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 
cdw10:00006969 cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.324459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.554 #8 NEW cov: 12344 ft: 13767 corp: 6/15b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:11.554 [2024-11-27 12:03:40.394718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.394749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.554 [2024-11-27 12:03:40.394857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:11.554 [2024-11-27 12:03:40.394875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.554 #9 NEW cov: 12344 ft: 13854 corp: 7/20b lim: 10 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 CrossOver- 00:08:11.812 [2024-11-27 12:03:40.444699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ee8 cdw11:00000000 00:08:11.812 [2024-11-27 12:03:40.444729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.812 #13 NEW cov: 12344 ft: 13997 corp: 8/23b lim: 10 exec/s: 0 rss: 72Mb L: 3/5 MS: 4 ChangeBit-ShuffleBytes-ShuffleBytes-CrossOver- 00:08:11.812 [2024-11-27 12:03:40.495783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 00:08:11.812 [2024-11-27 12:03:40.495811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.812 [2024-11-27 12:03:40.495932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.495950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.813 [2024-11-27 12:03:40.496066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.496084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.813 [2024-11-27 12:03:40.496196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.496215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.813 [2024-11-27 12:03:40.496316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.496332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.813 #14 NEW cov: 12344 ft: 14350 corp: 9/33b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:11.813 [2024-11-27 12:03:40.535187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e80a cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.535215] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.813 [2024-11-27 12:03:40.535326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80a cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.535343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.813 #15 NEW cov: 12344 ft: 14402 corp: 10/37b lim: 10 exec/s: 0 rss: 72Mb L: 4/10 MS: 1 CrossOver- 00:08:11.813 [2024-11-27 12:03:40.585067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ee8 cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.585096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.813 #16 NEW cov: 12344 ft: 14435 corp: 11/40b lim: 10 exec/s: 0 rss: 72Mb L: 3/10 MS: 1 CrossOver- 00:08:11.813 [2024-11-27 12:03:40.655655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.655684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.813 [2024-11-27 12:03:40.655785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002169 cdw11:00000000 00:08:11.813 [2024-11-27 12:03:40.655804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.813 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:11.813 #17 NEW cov: 12367 ft: 14488 corp: 12/44b lim: 10 exec/s: 0 rss: 72Mb L: 4/10 MS: 1 ChangeByte- 00:08:12.072 [2024-11-27 12:03:40.706386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e800 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.706416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.706533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.706551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.706663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000003a cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.706680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.706784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.706804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.706917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.706934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 
sqhd:0013 p:0 m:0 dnr:0 00:08:12.072 #18 NEW cov: 12367 ft: 14520 corp: 13/54b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeByte- 00:08:12.072 [2024-11-27 12:03:40.776144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ee8 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.776174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.776291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.776310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.776422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.776440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.072 #19 NEW cov: 12367 ft: 14659 corp: 14/61b lim: 10 exec/s: 19 rss: 72Mb L: 7/10 MS: 1 CrossOver- 00:08:12.072 [2024-11-27 12:03:40.826111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.826140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.826265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.826283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.072 #20 NEW cov: 12367 ft: 14686 corp: 15/66b lim: 10 exec/s: 20 rss: 73Mb L: 5/10 MS: 1 EraseBytes- 00:08:12.072 [2024-11-27 12:03:40.896694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a78 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.896723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.896837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007878 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.896854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.896959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007878 cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.896977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.072 [2024-11-27 12:03:40.897087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000780a cdw11:00000000 00:08:12.072 [2024-11-27 12:03:40.897106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.072 #21 NEW cov: 12367 ft: 14718 corp: 16/74b lim: 10 exec/s: 21 rss: 73Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:12.331 [2024-11-27 12:03:40.956672] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ec8 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:40.956700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.331 [2024-11-27 12:03:40.956808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:12.331 [2024-11-27 12:03:40.956828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.331 [2024-11-27 12:03:40.956940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:40.956957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.331 #22 NEW cov: 12367 ft: 14751 corp: 17/81b lim: 10 exec/s: 22 rss: 73Mb L: 7/10 MS: 1 ChangeBit- 00:08:12.331 [2024-11-27 12:03:41.006298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a24 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.006326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.331 #23 NEW cov: 12367 ft: 14761 corp: 18/83b lim: 10 exec/s: 23 rss: 73Mb L: 2/10 MS: 1 ChangeByte- 00:08:12.331 [2024-11-27 12:03:41.046520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.046549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.331 #24 NEW cov: 12367 ft: 14841 corp: 19/85b lim: 10 exec/s: 24 rss: 73Mb L: 2/10 MS: 1 CopyPart- 00:08:12.331 [2024-11-27 12:03:41.097055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e8ff cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.097084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.331 [2024-11-27 12:03:41.097187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.097203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.331 [2024-11-27 12:03:41.097312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.097328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.331 #25 NEW cov: 12367 ft: 14853 corp: 20/92b lim: 10 exec/s: 25 rss: 73Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:12.331 [2024-11-27 12:03:41.136786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ee8 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.136812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.331 #26 NEW cov: 12367 ft: 14876 corp: 21/95b lim: 10 exec/s: 26 rss: 73Mb L: 3/10 MS: 1 ChangeBit- 00:08:12.331 [2024-11-27 12:03:41.206978] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000824 cdw11:00000000 00:08:12.331 [2024-11-27 12:03:41.207006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.590 #27 NEW cov: 12367 ft: 14915 corp: 22/97b lim: 10 exec/s: 27 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:08:12.590 [2024-11-27 12:03:41.277421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e80a cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.277448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.590 [2024-11-27 12:03:41.277545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e80b cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.277563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.590 #28 NEW cov: 12367 ft: 14926 corp: 23/101b lim: 10 exec/s: 28 rss: 73Mb L: 4/10 MS: 1 CrossOver- 00:08:12.590 [2024-11-27 12:03:41.347646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e83b cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.347677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.590 [2024-11-27 12:03:41.347790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000ae8 cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.347806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.590 #29 NEW cov: 12367 ft: 14939 corp: 24/106b lim: 10 exec/s: 29 rss: 73Mb L: 5/10 MS: 1 InsertByte- 00:08:12.590 [2024-11-27 12:03:41.397794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e8e8 cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.397821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.590 [2024-11-27 12:03:41.397933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000ae8 cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.397950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.590 #30 NEW cov: 12367 ft: 14947 corp: 25/110b lim: 10 exec/s: 30 rss: 73Mb L: 4/10 MS: 1 CopyPart- 00:08:12.590 [2024-11-27 12:03:41.467779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:12.590 [2024-11-27 12:03:41.467808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.848 #31 NEW cov: 12367 ft: 14961 corp: 26/113b lim: 10 exec/s: 31 rss: 73Mb L: 3/10 MS: 1 EraseBytes- 00:08:12.849 [2024-11-27 12:03:41.538589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.538621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.538735] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.538752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.538862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.538879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.538995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.539013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.849 #35 NEW cov: 12367 ft: 14995 corp: 27/122b lim: 10 exec/s: 35 rss: 73Mb L: 9/10 MS: 4 EraseBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:12.849 [2024-11-27 12:03:41.588124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.588154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.849 #39 NEW cov: 12367 ft: 15013 corp: 28/124b lim: 10 exec/s: 39 rss: 73Mb L: 2/10 MS: 4 EraseBytes-ShuffleBytes-ChangeBit-CopyPart- 00:08:12.849 [2024-11-27 12:03:41.638480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e8e8 cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.638508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.638627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000031e8 cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.638644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.849 #40 NEW cov: 12367 ft: 15034 corp: 29/128b lim: 10 exec/s: 40 rss: 73Mb L: 4/10 MS: 1 ChangeByte- 00:08:12.849 [2024-11-27 12:03:41.708856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e8ff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.708884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.709002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff1e cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.709019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.849 [2024-11-27 12:03:41.709120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:00000000 00:08:12.849 [2024-11-27 12:03:41.709136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.108 #41 NEW cov: 12367 ft: 15061 corp: 30/135b lim: 10 exec/s: 41 rss: 73Mb L: 7/10 MS: 1 CMP- DE: "\036\000"- 00:08:13.108 [2024-11-27 12:03:41.778711] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a5d cdw11:00000000 00:08:13.108 [2024-11-27 12:03:41.778738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.108 #42 NEW cov: 12367 ft: 15065 corp: 31/137b lim: 10 exec/s: 21 rss: 73Mb L: 2/10 MS: 1 InsertByte- 00:08:13.108 #42 DONE cov: 12367 ft: 15065 corp: 31/137b lim: 10 exec/s: 21 rss: 73Mb 00:08:13.108 ###### Recommended dictionary. ###### 00:08:13.108 "\036\000" # Uses: 0 00:08:13.108 ###### End of recommended dictionary. ###### 00:08:13.108 Done 42 runs in 2 second(s) 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.108 12:03:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:13.108 [2024-11-27 12:03:41.958188] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:13.108 [2024-11-27 12:03:41.958273] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1725298 ] 00:08:13.367 [2024-11-27 12:03:42.141477] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.367 [2024-11-27 12:03:42.163356] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.367 [2024-11-27 12:03:42.215641] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.367 [2024-11-27 12:03:42.231950] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:13.367 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.367 INFO: Seed: 3112496957 00:08:13.626 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:13.626 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:13.626 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:13.626 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.626 #2 INITED exec/s: 0 rss: 65Mb 00:08:13.626 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.626 This may also happen if the target rejected all inputs we tried so far 00:08:13.626 [2024-11-27 12:03:42.308609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:13.626 [2024-11-27 12:03:42.308647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.885 NEW_FUNC[1/713]: 0x464b88 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:13.885 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.885 #3 NEW cov: 12140 ft: 12140 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:13.885 [2024-11-27 12:03:42.649488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:13.885 [2024-11-27 12:03:42.649535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.885 [2024-11-27 12:03:42.649651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:13.885 [2024-11-27 12:03:42.649672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.885 #4 NEW cov: 12253 ft: 12989 corp: 3/7b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:08:13.885 [2024-11-27 12:03:42.719248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001f00 cdw11:00000000 00:08:13.885 [2024-11-27 12:03:42.719277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.885 #5 NEW cov: 12259 ft: 13302 corp: 4/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 CMP- DE: "\037\000"- 00:08:13.885 [2024-11-27 12:03:42.769523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002f1f cdw11:00000000 00:08:13.885 [2024-11-27 12:03:42.769552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.144 #8 NEW cov: 12344 ft: 13622 corp: 5/12b lim: 10 exec/s: 0 rss: 72Mb L: 3/4 MS: 3 EraseBytes-ChangeBit-PersAutoDict- DE: "\037\000"- 00:08:14.144 [2024-11-27 12:03:42.809868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.144 [2024-11-27 12:03:42.809895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.144 [2024-11-27 12:03:42.810010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003d0a cdw11:00000000 00:08:14.144 [2024-11-27 12:03:42.810028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.144 #9 NEW cov: 12344 ft: 13696 corp: 6/16b lim: 10 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 ChangeBit- 00:08:14.144 [2024-11-27 12:03:42.879833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:14.144 [2024-11-27 12:03:42.879862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.144 #10 NEW cov: 12344 ft: 13733 corp: 7/18b lim: 10 exec/s: 0 rss: 73Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:14.144 [2024-11-27 12:03:42.930458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.145 [2024-11-27 12:03:42.930485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.145 [2024-11-27 12:03:42.930592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.145 [2024-11-27 12:03:42.930614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.145 [2024-11-27 12:03:42.930747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cd3f cdw11:00000000 00:08:14.145 [2024-11-27 12:03:42.930772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.145 #11 NEW cov: 12344 ft: 13941 corp: 8/25b lim: 10 exec/s: 0 rss: 73Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:14.145 [2024-11-27 12:03:42.970253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:14.145 [2024-11-27 12:03:42.970280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.145 [2024-11-27 12:03:42.970387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.145 [2024-11-27 12:03:42.970404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.145 #12 NEW cov: 12344 ft: 13991 corp: 9/30b lim: 10 exec/s: 0 rss: 73Mb L: 5/7 MS: 1 CrossOver- 00:08:14.404 [2024-11-27 12:03:43.040701] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.040732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.404 [2024-11-27 12:03:43.040839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.040862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.404 [2024-11-27 12:03:43.040973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.040993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.404 #13 NEW cov: 12344 ft: 14051 corp: 10/36b lim: 10 exec/s: 0 rss: 73Mb L: 6/7 MS: 1 CrossOver- 00:08:14.404 [2024-11-27 12:03:43.090683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1f cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.090712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.404 [2024-11-27 12:03:43.090817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000cd cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.090835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.404 #14 NEW cov: 12344 ft: 14183 corp: 11/41b lim: 10 exec/s: 0 rss: 73Mb L: 5/7 MS: 1 PersAutoDict- DE: "\037\000"- 00:08:14.404 [2024-11-27 12:03:43.160834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.160861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.404 [2024-11-27 12:03:43.160974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.160991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.404 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:14.404 #15 NEW cov: 12367 ft: 14256 corp: 12/45b lim: 10 exec/s: 0 rss: 73Mb L: 4/7 MS: 1 CrossOver- 00:08:14.404 [2024-11-27 12:03:43.210760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e300 cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.210786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.404 #16 NEW cov: 12367 ft: 14307 corp: 13/47b lim: 10 exec/s: 0 rss: 73Mb L: 2/7 MS: 1 ChangeBinInt- 00:08:14.404 [2024-11-27 12:03:43.281248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.281275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.404 [2024-11-27 
12:03:43.281381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.404 [2024-11-27 12:03:43.281396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.663 #17 NEW cov: 12367 ft: 14347 corp: 14/52b lim: 10 exec/s: 17 rss: 73Mb L: 5/7 MS: 1 EraseBytes- 00:08:14.663 [2024-11-27 12:03:43.351612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f1f cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.351639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.663 [2024-11-27 12:03:43.351758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.351776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.663 [2024-11-27 12:03:43.351891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003d0a cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.351909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.663 #18 NEW cov: 12367 ft: 14351 corp: 15/58b lim: 10 exec/s: 18 rss: 73Mb L: 6/7 MS: 1 PersAutoDict- DE: "\037\000"- 00:08:14.663 [2024-11-27 12:03:43.421795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f1f cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.421820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.663 [2024-11-27 12:03:43.421922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.421939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.663 [2024-11-27 12:03:43.422045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003d0a cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.422062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.663 #19 NEW cov: 12367 ft: 14356 corp: 16/64b lim: 10 exec/s: 19 rss: 73Mb L: 6/7 MS: 1 ChangeBit- 00:08:14.663 [2024-11-27 12:03:43.481788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003fcd cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.481815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.663 [2024-11-27 12:03:43.481916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.481934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.663 #20 NEW cov: 12367 ft: 14373 corp: 17/69b lim: 10 exec/s: 20 rss: 73Mb L: 5/7 MS: 1 ShuffleBytes- 00:08:14.663 [2024-11-27 12:03:43.531804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:4 nsid:0 cdw10:0000430a cdw11:00000000 00:08:14.663 [2024-11-27 12:03:43.531833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.922 #21 NEW cov: 12367 ft: 14381 corp: 18/72b lim: 10 exec/s: 21 rss: 73Mb L: 3/7 MS: 1 InsertByte- 00:08:14.922 [2024-11-27 12:03:43.582211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.582240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.922 [2024-11-27 12:03:43.582351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.582368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.922 #22 NEW cov: 12367 ft: 14396 corp: 19/77b lim: 10 exec/s: 22 rss: 74Mb L: 5/7 MS: 1 ShuffleBytes- 00:08:14.922 [2024-11-27 12:03:43.653022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.653053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.922 [2024-11-27 12:03:43.653160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000acd cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.653178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.922 [2024-11-27 12:03:43.653284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.653300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.922 [2024-11-27 12:03:43.653401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.653418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.922 [2024-11-27 12:03:43.653526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.653541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.922 #23 NEW cov: 12367 ft: 14662 corp: 20/87b lim: 10 exec/s: 23 rss: 74Mb L: 10/10 MS: 1 CrossOver- 00:08:14.922 [2024-11-27 12:03:43.702719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:08:14.922 [2024-11-27 12:03:43.702747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.923 [2024-11-27 12:03:43.702857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.702873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.923 
[2024-11-27 12:03:43.702986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.703004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.923 #24 NEW cov: 12367 ft: 14681 corp: 21/93b lim: 10 exec/s: 24 rss: 74Mb L: 6/10 MS: 1 InsertByte- 00:08:14.923 [2024-11-27 12:03:43.752621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.752650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.923 [2024-11-27 12:03:43.752755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003dd6 cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.752773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.923 #25 NEW cov: 12367 ft: 14715 corp: 22/97b lim: 10 exec/s: 25 rss: 74Mb L: 4/10 MS: 1 ChangeByte- 00:08:14.923 [2024-11-27 12:03:43.792972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.793000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.923 [2024-11-27 12:03:43.793124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002b0a cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.793142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.923 [2024-11-27 12:03:43.793257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cdcd cdw11:00000000 00:08:14.923 [2024-11-27 12:03:43.793273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.181 #26 NEW cov: 12367 ft: 14752 corp: 23/103b lim: 10 exec/s: 26 rss: 74Mb L: 6/10 MS: 1 InsertByte- 00:08:15.181 [2024-11-27 12:03:43.842755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:15.181 [2024-11-27 12:03:43.842783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.181 #27 NEW cov: 12367 ft: 14762 corp: 24/106b lim: 10 exec/s: 27 rss: 74Mb L: 3/10 MS: 1 CopyPart- 00:08:15.181 [2024-11-27 12:03:43.893242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f1f cdw11:00000000 00:08:15.181 [2024-11-27 12:03:43.893270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.181 [2024-11-27 12:03:43.893378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:15.181 [2024-11-27 12:03:43.893395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.181 [2024-11-27 12:03:43.893508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:6 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:15.181 [2024-11-27 12:03:43.893524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.181 #28 NEW cov: 12367 ft: 14774 corp: 25/112b lim: 10 exec/s: 28 rss: 74Mb L: 6/10 MS: 1 ChangeBit- 00:08:15.181 [2024-11-27 12:03:43.942995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1f cdw11:00000000 00:08:15.181 [2024-11-27 12:03:43.943024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.181 #29 NEW cov: 12367 ft: 14813 corp: 26/115b lim: 10 exec/s: 29 rss: 74Mb L: 3/10 MS: 1 EraseBytes- 00:08:15.181 [2024-11-27 12:03:44.013416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001f00 cdw11:00000000 00:08:15.181 [2024-11-27 12:03:44.013446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.181 [2024-11-27 12:03:44.013557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002f1f cdw11:00000000 00:08:15.181 [2024-11-27 12:03:44.013574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.181 #30 NEW cov: 12367 ft: 14818 corp: 27/120b lim: 10 exec/s: 30 rss: 74Mb L: 5/10 MS: 1 CopyPart- 00:08:15.440 [2024-11-27 12:03:44.073550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.073578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.440 [2024-11-27 12:03:44.073692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cd1f cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.073710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.440 #31 NEW cov: 12367 ft: 14868 corp: 28/125b lim: 10 exec/s: 31 rss: 74Mb L: 5/10 MS: 1 PersAutoDict- DE: "\037\000"- 00:08:15.440 [2024-11-27 12:03:44.123542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001f00 cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.123569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.440 #32 NEW cov: 12367 ft: 14944 corp: 29/127b lim: 10 exec/s: 32 rss: 74Mb L: 2/10 MS: 1 EraseBytes- 00:08:15.440 [2024-11-27 12:03:44.173611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001f00 cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.173641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.440 #33 NEW cov: 12367 ft: 14961 corp: 30/129b lim: 10 exec/s: 33 rss: 74Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:15.440 [2024-11-27 12:03:44.224076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e32f cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.224103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:15.440 [2024-11-27 12:03:44.224217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001f00 cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.224234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.440 #36 NEW cov: 12367 ft: 14976 corp: 31/133b lim: 10 exec/s: 36 rss: 74Mb L: 4/10 MS: 3 EraseBytes-ShuffleBytes-CrossOver- 00:08:15.440 [2024-11-27 12:03:44.284425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f3f cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.284453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.440 [2024-11-27 12:03:44.284568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.284583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.440 [2024-11-27 12:03:44.284700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:15.440 [2024-11-27 12:03:44.284716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.440 #37 NEW cov: 12367 ft: 15000 corp: 32/139b lim: 10 exec/s: 18 rss: 74Mb L: 6/10 MS: 1 InsertByte- 00:08:15.440 #37 DONE cov: 12367 ft: 15000 corp: 32/139b lim: 10 exec/s: 18 rss: 74Mb 00:08:15.440 ###### Recommended dictionary. ###### 00:08:15.440 "\037\000" # Uses: 4 00:08:15.440 ###### End of recommended dictionary. 
###### 00:08:15.440 Done 37 runs in 2 second(s) 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:15.699 12:03:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:15.699 [2024-11-27 12:03:44.487824] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:15.699 [2024-11-27 12:03:44.487894] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1725826 ] 00:08:15.959 [2024-11-27 12:03:44.662637] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.959 [2024-11-27 12:03:44.684925] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.959 [2024-11-27 12:03:44.737572] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.959 [2024-11-27 12:03:44.753876] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:15.959 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:15.959 INFO: Seed: 1340503919 00:08:15.959 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:15.959 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:15.959 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:15.959 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.959 [2024-11-27 12:03:44.830284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.959 [2024-11-27 12:03:44.830317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 #2 INITED cov: 12143 ft: 12144 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:16.217 [2024-11-27 12:03:44.880361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:44.880390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 #3 NEW cov: 12280 ft: 12693 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeBit- 00:08:16.217 [2024-11-27 12:03:44.950927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:44.950958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 [2024-11-27 12:03:44.951096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:44.951114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.217 #4 NEW cov: 12286 ft: 13661 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:08:16.217 [2024-11-27 12:03:45.000742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:45.000771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 #5 NEW cov: 12371 ft: 14112 corp: 4/5b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBit- 00:08:16.217 [2024-11-27 12:03:45.051211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:45.051240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 [2024-11-27 12:03:45.051381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:45.051400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.217 #6 NEW cov: 12371 ft: 14261 corp: 5/7b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:08:16.217 
[2024-11-27 12:03:45.101505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:45.101534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.217 [2024-11-27 12:03:45.101673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.217 [2024-11-27 12:03:45.101693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.476 #7 NEW cov: 12371 ft: 14301 corp: 6/9b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeByte- 00:08:16.476 [2024-11-27 12:03:45.171587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.476 [2024-11-27 12:03:45.171622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.476 [2024-11-27 12:03:45.171769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.476 [2024-11-27 12:03:45.171789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.476 #8 NEW cov: 12371 ft: 14376 corp: 7/11b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CopyPart- 00:08:16.476 [2024-11-27 12:03:45.241593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.476 [2024-11-27 12:03:45.241629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.476 #9 NEW cov: 12371 ft: 14455 corp: 8/12b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 EraseBytes- 00:08:16.476 [2024-11-27 12:03:45.311790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.476 [2024-11-27 12:03:45.311820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.476 #10 NEW cov: 12371 ft: 14480 corp: 9/13b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:08:16.735 [2024-11-27 12:03:45.382711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.382743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.382878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.382898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.383031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.383050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.735 #11 NEW cov: 12371 ft: 14696 corp: 10/16b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CopyPart- 00:08:16.735 [2024-11-27 12:03:45.453271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.453301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.453421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.453440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.453578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.453602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.453744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.453764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.735 #12 NEW cov: 12371 ft: 15054 corp: 11/20b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:16.735 [2024-11-27 12:03:45.503080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.503110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.503257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.503279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.735 [2024-11-27 12:03:45.503423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.503441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.735 #13 NEW cov: 12371 ft: 15100 corp: 12/23b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 ShuffleBytes- 00:08:16.735 [2024-11-27 12:03:45.572694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.735 [2024-11-27 12:03:45.572724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.735 #14 NEW cov: 12371 ft: 15141 corp: 13/24b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeByte- 00:08:16.994 [2024-11-27 12:03:45.643203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.994 [2024-11-27 12:03:45.643233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.994 [2024-11-27 12:03:45.643367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.994 [2024-11-27 12:03:45.643384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.994 #15 NEW cov: 12371 ft: 15189 corp: 14/26b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:08:16.994 [2024-11-27 12:03:45.693669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.994 [2024-11-27 12:03:45.693697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.994 [2024-11-27 12:03:45.693831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.995 [2024-11-27 12:03:45.693849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.995 [2024-11-27 12:03:45.693974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.995 [2024-11-27 12:03:45.693991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.253 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:17.253 #16 NEW cov: 12394 ft: 15220 corp: 15/29b lim: 5 exec/s: 16 rss: 73Mb L: 3/4 MS: 1 CrossOver- 00:08:17.253 [2024-11-27 12:03:46.003656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.253 [2024-11-27 12:03:46.003691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.253 #17 NEW cov: 12394 ft: 15297 corp: 16/30b lim: 5 exec/s: 17 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.254 [2024-11-27 12:03:46.063782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.254 [2024-11-27 12:03:46.063811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.254 #18 NEW cov: 12394 ft: 15410 corp: 17/31b lim: 5 exec/s: 18 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.254 [2024-11-27 12:03:46.104075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:17.254 [2024-11-27 12:03:46.104106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.254 [2024-11-27 12:03:46.104229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.254 [2024-11-27 12:03:46.104248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.254 #19 NEW cov: 12394 ft: 15478 corp: 18/33b lim: 5 exec/s: 19 rss: 73Mb L: 2/4 MS: 1 ChangeBit- 00:08:17.513 [2024-11-27 12:03:46.153973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.513 [2024-11-27 12:03:46.154001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.513 #20 NEW cov: 12394 ft: 15500 corp: 19/34b lim: 5 exec/s: 20 rss: 73Mb L: 1/4 MS: 1 CopyPart- 00:08:17.513 [2024-11-27 12:03:46.224277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.513 [2024-11-27 12:03:46.224306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.513 #21 NEW cov: 12394 ft: 15543 corp: 20/35b lim: 5 exec/s: 21 rss: 73Mb L: 1/4 MS: 1 ChangeByte- 00:08:17.513 [2024-11-27 12:03:46.294722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.513 [2024-11-27 12:03:46.294750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.513 [2024-11-27 12:03:46.294877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.513 [2024-11-27 12:03:46.294894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.513 #22 NEW cov: 12394 ft: 15548 corp: 21/37b lim: 5 exec/s: 22 rss: 73Mb L: 2/4 MS: 1 ChangeBit- 00:08:17.513 [2024-11-27 12:03:46.334569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.513 [2024-11-27 12:03:46.334601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.513 #23 NEW cov: 12394 ft: 15570 corp: 22/38b lim: 5 exec/s: 23 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:17.772 [2024-11-27 12:03:46.404747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.404775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.772 #24 NEW cov: 12394 ft: 15650 corp: 23/39b lim: 5 exec/s: 24 rss: 73Mb L: 1/4 MS: 1 ChangeByte- 00:08:17.772 [2024-11-27 12:03:46.475261] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.475292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.772 [2024-11-27 12:03:46.475412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.475431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.772 #25 NEW cov: 12394 ft: 15653 corp: 24/41b lim: 5 exec/s: 25 rss: 74Mb L: 2/4 MS: 1 InsertByte- 00:08:17.772 [2024-11-27 12:03:46.545209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.545238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.772 #26 NEW cov: 12394 ft: 15663 corp: 25/42b lim: 5 exec/s: 26 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:08:17.772 [2024-11-27 12:03:46.596149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.596178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.772 [2024-11-27 12:03:46.596300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.596317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.772 [2024-11-27 12:03:46.596437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.596454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.772 [2024-11-27 12:03:46.596573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.772 [2024-11-27 12:03:46.596590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.772 #27 NEW cov: 12394 ft: 15670 corp: 26/46b lim: 5 exec/s: 27 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:18.031 [2024-11-27 12:03:46.665683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.665715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.031 #28 NEW cov: 12394 ft: 15675 corp: 27/47b lim: 5 exec/s: 28 rss: 74Mb L: 1/4 MS: 1 ChangeBit- 00:08:18.031 [2024-11-27 12:03:46.715987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.716017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.031 [2024-11-27 12:03:46.716138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.716157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.031 #29 NEW cov: 12394 ft: 15694 corp: 28/49b lim: 5 exec/s: 29 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:08:18.031 [2024-11-27 12:03:46.786532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.786562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.031 [2024-11-27 12:03:46.786685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.786719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.031 [2024-11-27 12:03:46.786846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.031 [2024-11-27 12:03:46.786867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.031 #30 NEW cov: 12394 ft: 15723 corp: 29/52b lim: 5 exec/s: 15 rss: 74Mb L: 3/4 MS: 1 CrossOver- 00:08:18.031 #30 DONE cov: 12394 ft: 15723 corp: 29/52b lim: 5 exec/s: 15 rss: 74Mb 00:08:18.031 Done 30 runs in 2 second(s) 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.291 12:03:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:18.291 [2024-11-27 12:03:46.995368] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:18.291 [2024-11-27 12:03:46.995438] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1726287 ] 00:08:18.291 [2024-11-27 12:03:47.175478] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.551 [2024-11-27 12:03:47.197471] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.551 [2024-11-27 12:03:47.249723] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.551 [2024-11-27 12:03:47.266039] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:18.551 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:18.551 INFO: Seed: 3852514493 00:08:18.551 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:18.551 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:18.551 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:18.551 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.551 [2024-11-27 12:03:47.311364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.311396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.551 #2 INITED cov: 12168 ft: 12152 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:18.551 [2024-11-27 12:03:47.352020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.352046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.551 [2024-11-27 12:03:47.352102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.352115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.551 [2024-11-27 12:03:47.352167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.352181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.551 [2024-11-27 12:03:47.352232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.352245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.551 [2024-11-27 12:03:47.352296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.352310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.551 #3 NEW cov: 12281 ft: 13474 corp: 2/6b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:18.551 [2024-11-27 12:03:47.412164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.551 [2024-11-27 12:03:47.412190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.552 [2024-11-27 12:03:47.412245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-11-27 12:03:47.412259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.552 [2024-11-27 12:03:47.412313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-11-27 12:03:47.412326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.552 [2024-11-27 12:03:47.412378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-11-27 12:03:47.412391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.552 [2024-11-27 12:03:47.412442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.552 [2024-11-27 12:03:47.412455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.811 #4 NEW cov: 12287 ft: 13672 corp: 3/11b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:18.811 [2024-11-27 12:03:47.472289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.472318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.472370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.472384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.472433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.472446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.472498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.472511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.472561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.472574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.811 #5 NEW cov: 12372 ft: 14056 corp: 4/16b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:18.811 [2024-11-27 12:03:47.532452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 
12:03:47.532477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.532531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.532545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.532606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.532620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.532693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.532706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.532761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.532775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.811 #6 NEW cov: 12372 ft: 14125 corp: 5/21b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:18.811 [2024-11-27 12:03:47.572585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.572613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.572670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.572687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.572741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.572754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.572804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.572817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.811 [2024-11-27 12:03:47.572871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.811 [2024-11-27 12:03:47.572885] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.811 #7 NEW cov: 12372 ft: 14160 corp: 6/26b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBit- 00:08:18.812 [2024-11-27 12:03:47.632762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.632788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.632841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.632854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.632906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.632919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.632971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.632984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.633034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.633046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.812 #8 NEW cov: 12372 ft: 14297 corp: 7/31b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:18.812 [2024-11-27 12:03:47.692949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.692977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.693032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.693049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.693103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.693122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.693176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:18.812 [2024-11-27 12:03:47.693192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.812 [2024-11-27 12:03:47.693247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.812 [2024-11-27 12:03:47.693262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.071 #9 NEW cov: 12372 ft: 14330 corp: 8/36b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:19.071 [2024-11-27 12:03:47.732440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.732466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.071 #10 NEW cov: 12372 ft: 14426 corp: 9/37b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 CrossOver- 00:08:19.071 [2024-11-27 12:03:47.793229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.793264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.071 [2024-11-27 12:03:47.793317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.793331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.071 [2024-11-27 12:03:47.793381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.793394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.071 [2024-11-27 12:03:47.793446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.793475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.071 [2024-11-27 12:03:47.793527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.793541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.071 #11 NEW cov: 12372 ft: 14506 corp: 10/42b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:19.071 [2024-11-27 12:03:47.833335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.071 [2024-11-27 12:03:47.833360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.833413] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.833427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.833479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.833496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.833548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.833561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.833616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.833629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.072 #12 NEW cov: 12372 ft: 14533 corp: 11/47b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:19.072 [2024-11-27 12:03:47.893320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.893345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.893396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.893409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.893459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.893473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.893526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.893539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.072 #13 NEW cov: 12372 ft: 14631 corp: 12/51b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:19.072 [2024-11-27 12:03:47.953676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.953703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.953759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.953775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.953830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.953846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.953907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.953921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.072 [2024-11-27 12:03:47.953972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.072 [2024-11-27 12:03:47.953990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.331 #14 NEW cov: 12372 ft: 14666 corp: 13/56b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:08:19.332 [2024-11-27 12:03:48.013214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.013239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.332 #15 NEW cov: 12372 ft: 14684 corp: 14/57b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ChangeBinInt- 00:08:19.332 [2024-11-27 12:03:48.053296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.053321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.332 #16 NEW cov: 12372 ft: 14730 corp: 15/58b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 CrossOver- 00:08:19.332 [2024-11-27 12:03:48.113783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.113808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.113861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.113874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.113926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.113938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.332 #17 NEW cov: 12372 ft: 14922 corp: 16/61b lim: 5 exec/s: 0 rss: 72Mb L: 3/5 MS: 1 EraseBytes- 00:08:19.332 [2024-11-27 12:03:48.154207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.154233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.154289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.154304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.154357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.154372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.154427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.154440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.332 [2024-11-27 12:03:48.154490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.154504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.332 #18 NEW cov: 12372 ft: 14956 corp: 17/66b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:19.332 [2024-11-27 12:03:48.193765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.332 [2024-11-27 12:03:48.193790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.850 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:19.850 #19 NEW cov: 12395 ft: 15001 corp: 18/67b lim: 5 exec/s: 19 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:19.850 [2024-11-27 12:03:48.505354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.505387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.505444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 
12:03:48.505459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.505515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.505528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.505584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.505601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.505659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.505671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.850 #20 NEW cov: 12395 ft: 15080 corp: 19/72b lim: 5 exec/s: 20 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:08:19.850 [2024-11-27 12:03:48.545364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.545391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.545448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.545462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.545518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.545532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.545585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.545602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.545660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.545676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.850 #21 NEW cov: 12395 ft: 15137 corp: 20/77b lim: 5 exec/s: 21 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:19.850 [2024-11-27 12:03:48.585457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.585482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.585541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.585555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.585609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.585623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.585677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.585691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.850 [2024-11-27 12:03:48.585744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.585757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.850 #22 NEW cov: 12395 ft: 15141 corp: 21/82b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:19.850 [2024-11-27 12:03:48.624927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.850 [2024-11-27 12:03:48.624953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.850 #23 NEW cov: 12395 ft: 15196 corp: 22/83b lim: 5 exec/s: 23 rss: 73Mb L: 1/5 MS: 1 ChangeBit- 00:08:19.851 [2024-11-27 12:03:48.665694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.665721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.851 [2024-11-27 12:03:48.665778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.665791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.851 [2024-11-27 12:03:48.665847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.665861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.851 [2024-11-27 12:03:48.665915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.665929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.851 [2024-11-27 12:03:48.665987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.666004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.851 #24 NEW cov: 12395 ft: 15211 corp: 23/88b lim: 5 exec/s: 24 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:19.851 [2024-11-27 12:03:48.725196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.851 [2024-11-27 12:03:48.725222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.110 #25 NEW cov: 12395 ft: 15244 corp: 24/89b lim: 5 exec/s: 25 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:20.110 [2024-11-27 12:03:48.785522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.785548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.785610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.785624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.110 #26 NEW cov: 12395 ft: 15428 corp: 25/91b lim: 5 exec/s: 26 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:20.110 [2024-11-27 12:03:48.846185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.846210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.846268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.846283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.846338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.846351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.846406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.846420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.846472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.846486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.110 #27 NEW cov: 12395 ft: 15451 corp: 26/96b lim: 5 exec/s: 27 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:20.110 [2024-11-27 12:03:48.886014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.886039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.886096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.886109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.886170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.886184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.110 #28 NEW cov: 12395 ft: 15470 corp: 27/99b lim: 5 exec/s: 28 rss: 74Mb L: 3/5 MS: 1 ChangeBit- 00:08:20.110 [2024-11-27 12:03:48.946494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.946519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.946577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.946590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.946652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.946667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.946726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 12:03:48.946739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.110 [2024-11-27 12:03:48.946796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.110 [2024-11-27 
12:03:48.946809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.110 #29 NEW cov: 12395 ft: 15497 corp: 28/104b lim: 5 exec/s: 29 rss: 74Mb L: 5/5 MS: 1 ChangeByte- 00:08:20.370 [2024-11-27 12:03:49.006667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.006693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.006751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.006765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.006820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.006834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.006891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.006904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.006960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.006973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.370 #30 NEW cov: 12395 ft: 15514 corp: 29/109b lim: 5 exec/s: 30 rss: 74Mb L: 5/5 MS: 1 CopyPart- 00:08:20.370 [2024-11-27 12:03:49.046784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.046811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.046868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.046882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.046937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.046951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.047021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.047035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.047089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.047102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.370 #31 NEW cov: 12395 ft: 15523 corp: 30/114b lim: 5 exec/s: 31 rss: 74Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:20.370 [2024-11-27 12:03:49.086909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.086935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.086994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.087008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.087064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.087077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.087132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.087145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.087201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.087215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.370 #32 NEW cov: 12395 ft: 15526 corp: 31/119b lim: 5 exec/s: 32 rss: 74Mb L: 5/5 MS: 1 ChangeByte- 00:08:20.370 [2024-11-27 12:03:49.147085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.147110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.147171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.147185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.147241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.147255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.147310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.147323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.147379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.147392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.370 #33 NEW cov: 12395 ft: 15540 corp: 32/124b lim: 5 exec/s: 33 rss: 74Mb L: 5/5 MS: 1 CrossOver- 00:08:20.370 [2024-11-27 12:03:49.187026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.187051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.187109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.187123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.187179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.187194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.370 [2024-11-27 12:03:49.187247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.187260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.370 #34 NEW cov: 12395 ft: 15545 corp: 33/128b lim: 5 exec/s: 34 rss: 74Mb L: 4/5 MS: 1 EraseBytes- 00:08:20.370 [2024-11-27 12:03:49.226620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.370 [2024-11-27 12:03:49.226645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.370 #35 NEW cov: 12395 ft: 15604 corp: 34/129b lim: 5 exec/s: 35 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:20.629 [2024-11-27 12:03:49.267393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.629 [2024-11-27 12:03:49.267418] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.629 [2024-11-27 12:03:49.267474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.629 [2024-11-27 12:03:49.267491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.629 [2024-11-27 12:03:49.267545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.629 [2024-11-27 12:03:49.267559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.629 [2024-11-27 12:03:49.267615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.629 [2024-11-27 12:03:49.267628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.629 [2024-11-27 12:03:49.267683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.629 [2024-11-27 12:03:49.267697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.629 #36 NEW cov: 12395 ft: 15618 corp: 35/134b lim: 5 exec/s: 18 rss: 74Mb L: 5/5 MS: 1 ChangeBit- 00:08:20.629 #36 DONE cov: 12395 ft: 15618 corp: 35/134b lim: 5 exec/s: 18 rss: 74Mb 00:08:20.629 Done 36 runs in 2 second(s) 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 
00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.629 12:03:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:20.629 [2024-11-27 12:03:49.474973] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:20.629 [2024-11-27 12:03:49.475039] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1726648 ] 00:08:20.889 [2024-11-27 12:03:49.647639] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.889 [2024-11-27 12:03:49.669266] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.889 [2024-11-27 12:03:49.721533] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.889 [2024-11-27 12:03:49.737926] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:20.889 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.889 INFO: Seed: 2027565925 00:08:21.148 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:21.148 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:21.148 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:21.148 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.148 #2 INITED exec/s: 0 rss: 65Mb 00:08:21.148 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:21.148 This may also happen if the target rejected all inputs we tried so far 00:08:21.148 [2024-11-27 12:03:49.797297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0e45 cdw11:45454545 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.148 [2024-11-27 12:03:49.797326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 [2024-11-27 12:03:49.797385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:45454545 cdw11:45454545 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.148 [2024-11-27 12:03:49.797399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.407 NEW_FUNC[1/714]: 0x466508 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:21.407 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.407 #21 NEW cov: 12191 ft: 12180 corp: 2/19b lim: 40 exec/s: 0 rss: 72Mb L: 18/18 MS: 4 CrossOver-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:21.407 [2024-11-27 12:03:50.128794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.128858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.128946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.128974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.129057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.129083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.129166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.129192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.407 #24 NEW cov: 12304 ft: 13333 corp: 3/52b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:21.407 [2024-11-27 12:03:50.178502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.178529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.178591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:21.407 [2024-11-27 12:03:50.178610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.178670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.178684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.178754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.178767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.407 #30 NEW cov: 12310 ft: 13598 corp: 4/85b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:21.407 [2024-11-27 12:03:50.238517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.238543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.238618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.238632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.238692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.238706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.407 #33 NEW cov: 12395 ft: 14042 corp: 5/114b lim: 40 exec/s: 0 rss: 73Mb L: 29/33 MS: 3 CrossOver-CopyPart-CrossOver- 00:08:21.407 [2024-11-27 12:03:50.278468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.278494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.407 [2024-11-27 12:03:50.278552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.407 [2024-11-27 12:03:50.278566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 #36 NEW cov: 12395 ft: 14200 corp: 6/130b lim: 40 exec/s: 0 rss: 73Mb L: 16/33 MS: 3 CopyPart-ShuffleBytes-CrossOver- 00:08:21.666 [2024-11-27 12:03:50.318858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.318884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.666 
[2024-11-27 12:03:50.318943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.318958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.319012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.319026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.319088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.319101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.666 #37 NEW cov: 12395 ft: 14287 corp: 7/163b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 ChangeByte- 00:08:21.666 [2024-11-27 12:03:50.379017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.379042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.379099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.379112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.379170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.379184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.379239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2c2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.379252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.666 #38 NEW cov: 12395 ft: 14351 corp: 8/197b lim: 40 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 InsertByte- 00:08:21.666 [2024-11-27 12:03:50.419257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.419282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.419340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.419354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.419409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.419422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.419477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.419490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.419544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:2b2b2c2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.419557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.666 #39 NEW cov: 12395 ft: 14398 corp: 9/237b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:21.666 [2024-11-27 12:03:50.479028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.479055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.479127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.479142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 #40 NEW cov: 12395 ft: 14482 corp: 10/253b lim: 40 exec/s: 0 rss: 73Mb L: 16/40 MS: 1 ShuffleBytes- 00:08:21.666 [2024-11-27 12:03:50.539424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.539449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.539508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.539522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.539574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.539587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.666 [2024-11-27 12:03:50.539651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.666 [2024-11-27 12:03:50.539664] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.926 #41 NEW cov: 12395 ft: 14601 corp: 11/286b lim: 40 exec/s: 0 rss: 73Mb L: 33/40 MS: 1 CrossOver- 00:08:21.926 [2024-11-27 12:03:50.579433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e2b2d cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.579458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.579519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.579533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.579589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.579608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.926 #42 NEW cov: 12395 ft: 14634 corp: 12/315b lim: 40 exec/s: 0 rss: 73Mb L: 29/40 MS: 1 ChangeByte- 00:08:21.926 [2024-11-27 12:03:50.639619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.639644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.639701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.639715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.639770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.639787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.926 #43 NEW cov: 12395 ft: 14658 corp: 13/345b lim: 40 exec/s: 0 rss: 73Mb L: 30/40 MS: 1 CrossOver- 00:08:21.926 [2024-11-27 12:03:50.679846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.679871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.679927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.679941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.679995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.680008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.680064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.680077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.926 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:21.926 #44 NEW cov: 12418 ft: 14722 corp: 14/383b lim: 40 exec/s: 0 rss: 73Mb L: 38/40 MS: 1 CopyPart- 00:08:21.926 [2024-11-27 12:03:50.739753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.739778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.739837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.739851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.926 #45 NEW cov: 12418 ft: 14756 corp: 15/403b lim: 40 exec/s: 0 rss: 73Mb L: 20/40 MS: 1 EraseBytes- 00:08:21.926 [2024-11-27 12:03:50.779849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a980e2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.779874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.926 [2024-11-27 12:03:50.779933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.926 [2024-11-27 12:03:50.779946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 #46 NEW cov: 12418 ft: 14786 corp: 16/420b lim: 40 exec/s: 46 rss: 73Mb L: 17/40 MS: 1 InsertByte- 00:08:22.186 [2024-11-27 12:03:50.840252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b00 cdw11:00002b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.840276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.840352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.840368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.840436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.840449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.840507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.840519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.186 #47 NEW cov: 12418 ft: 14796 corp: 17/457b lim: 40 exec/s: 47 rss: 73Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:08:22.186 [2024-11-27 12:03:50.880427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.880453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.880508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.880522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.880574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.880588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.880648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.880661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.880715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:2b2b2c2b cdw11:2bffff2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.880727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.186 #48 NEW cov: 12418 ft: 14810 corp: 18/497b lim: 40 exec/s: 48 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:22.186 [2024-11-27 12:03:50.940163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:cdfd8d12 cdw11:c05c9200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.940188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.186 #49 NEW cov: 12418 ft: 15115 corp: 19/506b lim: 40 exec/s: 49 rss: 74Mb L: 9/40 MS: 1 CMP- DE: "\315\375\215\022\300\\\222\000"- 00:08:22.186 [2024-11-27 12:03:50.980604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e0000 cdw11:0000002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.980629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.980687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.980701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.980759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.980776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:50.980831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:50.980844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.186 #50 NEW cov: 12418 ft: 15158 corp: 20/540b lim: 40 exec/s: 50 rss: 74Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:08:22.186 [2024-11-27 12:03:51.020640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.020666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:51.020724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.020738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:51.020794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.020808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.186 #51 NEW cov: 12418 ft: 15229 corp: 21/565b lim: 40 exec/s: 51 rss: 74Mb L: 25/40 MS: 1 EraseBytes- 00:08:22.186 [2024-11-27 12:03:51.060694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.060719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:51.060779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.060792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.186 [2024-11-27 12:03:51.060849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.186 [2024-11-27 12:03:51.060862] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.483 #52 NEW cov: 12418 ft: 15242 corp: 22/596b lim: 40 exec/s: 52 rss: 74Mb L: 31/40 MS: 1 CrossOver- 00:08:22.483 [2024-11-27 12:03:51.101103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:722b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.101128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.101185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.101198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.101251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.101263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.101321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.101334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.101388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:2b2b2c2b cdw11:2bffff2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.101401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.161247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:722bcdfd cdw11:8d12c05c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.161272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.161331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:92002b2b cdw11:2b2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.161344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.161398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.161411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.161465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.161478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.161534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:2b2b2c2b cdw11:2bffff2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.161546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.483 #54 NEW cov: 12418 ft: 15264 corp: 23/636b lim: 40 exec/s: 54 rss: 74Mb L: 40/40 MS: 2 ChangeByte-PersAutoDict- DE: "\315\375\215\022\300\\\222\000"- 00:08:22.483 [2024-11-27 12:03:51.200860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:cdfd8d12 cdw11:c05c9225 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.200885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 #55 NEW cov: 12418 ft: 15286 corp: 24/645b lim: 40 exec/s: 55 rss: 74Mb L: 9/40 MS: 1 ChangeByte- 00:08:22.483 [2024-11-27 12:03:51.261353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.261379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.261436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2bcdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.261449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.261506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d12c05c cdw11:92002b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.261520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.261580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.261593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.483 #56 NEW cov: 12418 ft: 15288 corp: 25/682b lim: 40 exec/s: 56 rss: 74Mb L: 37/40 MS: 1 PersAutoDict- DE: "\315\375\215\022\300\\\222\000"- 00:08:22.483 [2024-11-27 12:03:51.301402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a2b0e2b cdw11:2d2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.301429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.301488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.301501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.301560] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.301574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.483 #57 NEW cov: 12418 ft: 15303 corp: 26/707b lim: 40 exec/s: 57 rss: 74Mb L: 25/40 MS: 1 CrossOver- 00:08:22.483 [2024-11-27 12:03:51.341359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0e2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.341383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.483 [2024-11-27 12:03:51.341440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.483 [2024-11-27 12:03:51.341454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.483 #58 NEW cov: 12418 ft: 15335 corp: 27/728b lim: 40 exec/s: 58 rss: 74Mb L: 21/40 MS: 1 EraseBytes- 00:08:22.742 [2024-11-27 12:03:51.381725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.381750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.381809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b0b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.381822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.381877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.381890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.381945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.381958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.743 #59 NEW cov: 12418 ft: 15355 corp: 28/761b lim: 40 exec/s: 59 rss: 74Mb L: 33/40 MS: 1 ChangeBit- 00:08:22.743 [2024-11-27 12:03:51.441903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0e2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.441931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.441989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.442002] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.442058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.442071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.442126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.442139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.743 #60 NEW cov: 12418 ft: 15388 corp: 29/796b lim: 40 exec/s: 60 rss: 74Mb L: 35/40 MS: 1 CopyPart- 00:08:22.743 [2024-11-27 12:03:51.502062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e2b00 cdw11:00002b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.502087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.502146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.502160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.502233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.502247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.502303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b0a0e2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.502316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.743 #61 NEW cov: 12418 ft: 15396 corp: 30/830b lim: 40 exec/s: 61 rss: 74Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:08:22.743 [2024-11-27 12:03:51.561874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2b0e0000 cdw11:2b0e002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.561901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.743 #62 NEW cov: 12418 ft: 15445 corp: 31/838b lim: 40 exec/s: 62 rss: 74Mb L: 8/40 MS: 1 CrossOver- 00:08:22.743 [2024-11-27 12:03:51.622245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a2b0e2b cdw11:2d2b2bcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.622271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.622332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fd8d12c0 cdw11:5c92002b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.622347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.743 [2024-11-27 12:03:51.622403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.743 [2024-11-27 12:03:51.622420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.002 #63 NEW cov: 12418 ft: 15453 corp: 32/863b lim: 40 exec/s: 63 rss: 74Mb L: 25/40 MS: 1 PersAutoDict- DE: "\315\375\215\022\300\\\222\000"- 00:08:23.002 [2024-11-27 12:03:51.682680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:722b8dfd cdw11:8d12c05c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.682706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.682764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:92002b2b cdw11:2b2bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.682777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.682834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.682847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.682905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.682918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.682973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:2b2b2c2b cdw11:2bffff2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.682986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.002 #64 NEW cov: 12418 ft: 15459 corp: 33/903b lim: 40 exec/s: 64 rss: 75Mb L: 40/40 MS: 1 ChangeBit- 00:08:23.002 [2024-11-27 12:03:51.742754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a2b0e2b cdw11:2d2b2bcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.742778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.742839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fd8d0909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.742852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:23.002 [2024-11-27 12:03:51.742908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090912 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.742921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.002 [2024-11-27 12:03:51.742976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c05c9200 cdw11:2b2b2b2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.002 [2024-11-27 12:03:51.742989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.002 #65 NEW cov: 12418 ft: 15465 corp: 34/941b lim: 40 exec/s: 32 rss: 75Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:23.002 #65 DONE cov: 12418 ft: 15465 corp: 34/941b lim: 40 exec/s: 32 rss: 75Mb 00:08:23.002 ###### Recommended dictionary. ###### 00:08:23.002 "\315\375\215\022\300\\\222\000" # Uses: 3 00:08:23.002 ###### End of recommended dictionary. ###### 00:08:23.002 Done 65 runs in 2 second(s) 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.261 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.262 12:03:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:23.262 [2024-11-27 12:03:51.942138] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:23.262 [2024-11-27 12:03:51.942216] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727182 ] 00:08:23.262 [2024-11-27 12:03:52.119571] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.262 [2024-11-27 12:03:52.141355] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.521 [2024-11-27 12:03:52.193654] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.521 [2024-11-27 12:03:52.210006] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:23.521 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.521 INFO: Seed: 205580313 00:08:23.521 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:23.521 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:23.521 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:23.521 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.521 #2 INITED exec/s: 0 rss: 66Mb 00:08:23.521 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.521 This may also happen if the target rejected all inputs we tried so far 00:08:23.521 [2024-11-27 12:03:52.255831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.521 [2024-11-27 12:03:52.255859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.521 [2024-11-27 12:03:52.255916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.521 [2024-11-27 12:03:52.255934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.521 [2024-11-27 12:03:52.255992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.521 [2024-11-27 12:03:52.256006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.521 [2024-11-27 12:03:52.256061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.521 [2024-11-27 12:03:52.256074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.780 NEW_FUNC[1/714]: 0x468278 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:23.780 NEW_FUNC[2/714]: 0x495e28 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.780 #3 NEW cov: 12200 ft: 12197 corp: 2/34b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:23.780 [2024-11-27 12:03:52.566268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.780 [2024-11-27 12:03:52.566299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.781 [2024-11-27 12:03:52.566356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.781 [2024-11-27 12:03:52.566369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.781 NEW_FUNC[1/1]: 0x1f63a08 in thread_update_stats /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:930 00:08:23.781 #5 NEW cov: 12316 ft: 13135 corp: 3/54b lim: 40 exec/s: 0 rss: 72Mb L: 20/33 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:23.781 [2024-11-27 12:03:52.606475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.781 [2024-11-27 12:03:52.606501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.781 [2024-11-27 12:03:52.606558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.781 [2024-11-27 12:03:52.606572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.781 [2024-11-27 12:03:52.606630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.781 [2024-11-27 12:03:52.606643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.781 #6 NEW cov: 12322 ft: 13484 corp: 4/84b lim: 40 exec/s: 0 rss: 72Mb L: 30/33 MS: 1 EraseBytes- 00:08:24.040 [2024-11-27 12:03:52.666496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.666522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.666585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.666605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.040 #7 NEW cov: 12407 ft: 13685 corp: 5/104b lim: 40 exec/s: 0 rss: 72Mb L: 20/33 MS: 1 ShuffleBytes- 00:08:24.040 [2024-11-27 12:03:52.726611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.726637] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.726698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.726712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.040 #8 NEW cov: 12407 ft: 13821 corp: 6/124b lim: 40 exec/s: 0 rss: 73Mb L: 20/33 MS: 1 CopyPart- 00:08:24.040 [2024-11-27 12:03:52.787098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.787124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.787180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.787194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.787249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.787264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.787321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.787335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.040 #9 NEW cov: 12407 ft: 13954 corp: 7/162b lim: 40 exec/s: 0 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:24.040 [2024-11-27 12:03:52.847262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.847289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.847359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.847374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.847427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.847441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.847499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 
12:03:52.847512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.040 #10 NEW cov: 12407 ft: 14006 corp: 8/200b lim: 40 exec/s: 0 rss: 73Mb L: 38/38 MS: 1 CopyPart- 00:08:24.040 [2024-11-27 12:03:52.907147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:67121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.907173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.040 [2024-11-27 12:03:52.907231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.040 [2024-11-27 12:03:52.907244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 #11 NEW cov: 12407 ft: 14092 corp: 9/220b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 ChangeByte- 00:08:24.300 [2024-11-27 12:03:52.947370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:52.947397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:52.947469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:52.947484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:52.947538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:52.947551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.300 #12 NEW cov: 12407 ft: 14117 corp: 10/250b lim: 40 exec/s: 0 rss: 73Mb L: 30/38 MS: 1 ChangeBinInt- 00:08:24.300 [2024-11-27 12:03:52.987335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:52.987360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:52.987432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:52.987446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 #13 NEW cov: 12407 ft: 14234 corp: 11/270b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 ShuffleBytes- 00:08:24.300 [2024-11-27 12:03:53.027464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121400 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.027490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:24.300 [2024-11-27 12:03:53.027544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.027558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 #14 NEW cov: 12407 ft: 14261 corp: 12/290b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 ChangeBinInt- 00:08:24.300 [2024-11-27 12:03:53.067539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.067564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:53.067643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12181212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.067658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 #15 NEW cov: 12407 ft: 14339 corp: 13/310b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 ChangeBinInt- 00:08:24.300 [2024-11-27 12:03:53.127732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.127761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:53.127831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.127846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.300 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:24.300 #16 NEW cov: 12430 ft: 14355 corp: 14/330b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 CMP- DE: "\000\000\000\177"- 00:08:24.300 [2024-11-27 12:03:53.168167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.168191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.300 [2024-11-27 12:03:53.168262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25000000 cdw11:7f252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.300 [2024-11-27 12:03:53.168277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.301 [2024-11-27 12:03:53.168333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.301 [2024-11-27 12:03:53.168346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.301 [2024-11-27 12:03:53.168401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:2525252d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.301 [2024-11-27 12:03:53.168414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.560 #17 NEW cov: 12430 ft: 14397 corp: 15/364b lim: 40 exec/s: 0 rss: 73Mb L: 34/38 MS: 1 PersAutoDict- DE: "\000\000\000\177"- 00:08:24.560 [2024-11-27 12:03:53.228393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.228419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.228473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25000000 cdw11:7f252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.228487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.228539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.228552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.228614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:2525252d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.228628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.560 #18 NEW cov: 12430 ft: 14437 corp: 16/398b lim: 40 exec/s: 18 rss: 73Mb L: 34/38 MS: 1 CopyPart- 00:08:24.560 [2024-11-27 12:03:53.288514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.288540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.288620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25290000 cdw11:7f252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.288635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.288692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.288706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.288761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:2525252d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.288775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.560 #19 NEW cov: 12430 ft: 14490 corp: 17/432b lim: 40 
exec/s: 19 rss: 73Mb L: 34/38 MS: 1 ChangeByte- 00:08:24.560 [2024-11-27 12:03:53.348373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.348399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.348457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121218 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.348470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.560 #20 NEW cov: 12430 ft: 14499 corp: 18/450b lim: 40 exec/s: 20 rss: 73Mb L: 18/38 MS: 1 EraseBytes- 00:08:24.560 [2024-11-27 12:03:53.408511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.408536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.560 [2024-11-27 12:03:53.408615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12181212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.560 [2024-11-27 12:03:53.408629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.560 #21 NEW cov: 12430 ft: 14511 corp: 19/470b lim: 40 exec/s: 21 rss: 73Mb L: 20/38 MS: 1 ShuffleBytes- 00:08:24.819 [2024-11-27 12:03:53.448863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.448888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.448945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.448959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.449013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00007f12 cdw11:0000007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.449026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.819 #22 NEW cov: 12430 ft: 14517 corp: 20/494b lim: 40 exec/s: 22 rss: 73Mb L: 24/38 MS: 1 PersAutoDict- DE: "\000\000\000\177"- 00:08:24.819 [2024-11-27 12:03:53.508828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.508856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.508928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:18121212 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.508942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.819 #23 NEW cov: 12430 ft: 14543 corp: 21/512b lim: 40 exec/s: 23 rss: 73Mb L: 18/38 MS: 1 ShuffleBytes- 00:08:24.819 [2024-11-27 12:03:53.569010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.569035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.569106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.569120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.819 #24 NEW cov: 12430 ft: 14558 corp: 22/535b lim: 40 exec/s: 24 rss: 73Mb L: 23/38 MS: 1 EraseBytes- 00:08:24.819 [2024-11-27 12:03:53.609090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.609115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.609186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25290000 cdw11:7f252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.609200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.819 #25 NEW cov: 12430 ft: 14563 corp: 23/554b lim: 40 exec/s: 25 rss: 73Mb L: 19/38 MS: 1 EraseBytes- 00:08:24.819 [2024-11-27 12:03:53.669420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.669444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.669515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12123939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.669530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.819 [2024-11-27 12:03:53.669587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39391212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.819 [2024-11-27 12:03:53.669608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.078 #26 NEW cov: 12430 ft: 14652 corp: 24/584b lim: 40 exec/s: 26 rss: 73Mb L: 30/38 MS: 1 InsertRepeatedBytes- 00:08:25.078 [2024-11-27 12:03:53.729398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.729423] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.729495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121218 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.729509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.078 #27 NEW cov: 12430 ft: 14663 corp: 25/602b lim: 40 exec/s: 27 rss: 73Mb L: 18/38 MS: 1 ShuffleBytes- 00:08:25.078 [2024-11-27 12:03:53.769858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.769883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.769954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25002500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.769968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.770023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00002525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.770035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.770091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.770103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.078 #28 NEW cov: 12430 ft: 14673 corp: 26/641b lim: 40 exec/s: 28 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:08:25.078 [2024-11-27 12:03:53.829698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:67121212 cdw11:123f1212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.829722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.829794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.829807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.078 #29 NEW cov: 12430 ft: 14680 corp: 27/662b lim: 40 exec/s: 29 rss: 74Mb L: 21/39 MS: 1 InsertByte- 00:08:25.078 [2024-11-27 12:03:53.890007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.890032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.890104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12181212 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.890118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.890174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:12121212 cdw11:ff915cc1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.890187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.078 #30 NEW cov: 12430 ft: 14682 corp: 28/690b lim: 40 exec/s: 30 rss: 74Mb L: 28/39 MS: 1 CMP- DE: "\377\221\\\301\312\024\274\024"- 00:08:25.078 [2024-11-27 12:03:53.930184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.930209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.930266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12181212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.930283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.078 [2024-11-27 12:03:53.930339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:12121200 cdw11:00007f12 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.078 [2024-11-27 12:03:53.930353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.078 #31 NEW cov: 12430 ft: 14691 corp: 29/714b lim: 40 exec/s: 31 rss: 74Mb L: 24/39 MS: 1 PersAutoDict- DE: "\000\000\000\177"- 00:08:25.337 [2024-11-27 12:03:53.970451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.337 [2024-11-27 12:03:53.970477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.337 [2024-11-27 12:03:53.970551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25000000 cdw11:7f252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.337 [2024-11-27 12:03:53.970565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.337 [2024-11-27 12:03:53.970616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.337 [2024-11-27 12:03:53.970630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:53.970684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:2525252d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:53.970697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.338 #32 NEW cov: 12430 ft: 14697 corp: 30/749b lim: 40 exec/s: 32 rss: 74Mb L: 35/39 MS: 1 
InsertByte- 00:08:25.338 [2024-11-27 12:03:54.010270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a5671212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.010295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.010351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.010364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.338 #33 NEW cov: 12430 ft: 14708 corp: 31/770b lim: 40 exec/s: 33 rss: 74Mb L: 21/39 MS: 1 InsertByte- 00:08:25.338 [2024-11-27 12:03:54.050376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a5671212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.050401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.050455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12122e12 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.050469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.338 #34 NEW cov: 12430 ft: 14720 corp: 32/791b lim: 40 exec/s: 34 rss: 74Mb L: 21/39 MS: 1 ChangeByte- 00:08:25.338 [2024-11-27 12:03:54.110899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:dbd92525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.110925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.110981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.110998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.111054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.111068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.111122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.111135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.338 #35 NEW cov: 12430 ft: 14781 corp: 33/824b lim: 40 exec/s: 35 rss: 74Mb L: 33/39 MS: 1 ChangeBinInt- 00:08:25.338 [2024-11-27 12:03:54.150827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 
12:03:54.150854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.150912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.150926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.150980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:25252525 cdw11:25252525 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.150993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.338 #36 NEW cov: 12430 ft: 14835 corp: 34/854b lim: 40 exec/s: 36 rss: 74Mb L: 30/39 MS: 1 CopyPart- 00:08:25.338 [2024-11-27 12:03:54.190826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:67121212 cdw11:123f1212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.190852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.338 [2024-11-27 12:03:54.190907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.338 [2024-11-27 12:03:54.190920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.597 #37 NEW cov: 12430 ft: 14860 corp: 35/874b lim: 40 exec/s: 37 rss: 74Mb L: 20/39 MS: 1 EraseBytes- 00:08:25.597 [2024-11-27 12:03:54.251313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:67121212 cdw11:123f1212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.597 [2024-11-27 12:03:54.251338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.597 [2024-11-27 12:03:54.251392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.597 [2024-11-27 12:03:54.251405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.597 [2024-11-27 12:03:54.251459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:18121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.597 [2024-11-27 12:03:54.251472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.597 [2024-11-27 12:03:54.251527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:12121212 cdw11:12121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.597 [2024-11-27 12:03:54.251543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.597 #38 NEW cov: 12430 ft: 14887 corp: 36/910b lim: 40 exec/s: 19 rss: 74Mb L: 36/39 MS: 1 CrossOver- 00:08:25.597 #38 DONE cov: 12430 ft: 14887 corp: 36/910b lim: 40 exec/s: 19 rss: 74Mb 00:08:25.597 ###### 
Recommended dictionary. ###### 00:08:25.597 "\000\000\000\177" # Uses: 3 00:08:25.597 "\377\221\\\301\312\024\274\024" # Uses: 0 00:08:25.597 ###### End of recommended dictionary. ###### 00:08:25.597 Done 38 runs in 2 second(s) 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:25.597 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.598 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.598 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.598 12:03:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:25.598 [2024-11-27 12:03:54.434553] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:25.598 [2024-11-27 12:03:54.434634] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727486 ] 00:08:25.856 [2024-11-27 12:03:54.619519] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.856 [2024-11-27 12:03:54.641760] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.856 [2024-11-27 12:03:54.694050] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.856 [2024-11-27 12:03:54.710420] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:25.856 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.856 INFO: Seed: 2706606166 00:08:26.115 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:26.115 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:26.115 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:26.115 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.115 #2 INITED exec/s: 0 rss: 65Mb 00:08:26.115 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.115 This may also happen if the target rejected all inputs we tried so far 00:08:26.115 [2024-11-27 12:03:54.765880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.115 [2024-11-27 12:03:54.765908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.115 [2024-11-27 12:03:54.765978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.115 [2024-11-27 12:03:54.765992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.374 NEW_FUNC[1/714]: 0x469fe8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:26.374 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.374 #15 NEW cov: 12200 ft: 12200 corp: 2/21b lim: 40 exec/s: 0 rss: 71Mb L: 20/20 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:26.374 [2024-11-27 12:03:55.097077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.097109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 [2024-11-27 12:03:55.097164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.097177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.374 [2024-11-27 12:03:55.097232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 
cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.097246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.374 NEW_FUNC[1/1]: 0xf859f8 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:321 00:08:26.374 #20 NEW cov: 12314 ft: 12974 corp: 3/52b lim: 40 exec/s: 0 rss: 72Mb L: 31/31 MS: 5 CopyPart-ShuffleBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:26.374 [2024-11-27 12:03:55.136958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.136984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 [2024-11-27 12:03:55.137038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.137052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.374 #21 NEW cov: 12320 ft: 13108 corp: 4/72b lim: 40 exec/s: 0 rss: 72Mb L: 20/31 MS: 1 ShuffleBytes- 00:08:26.374 [2024-11-27 12:03:55.197109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.197134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 [2024-11-27 12:03:55.197188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e51ae5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.197201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.374 #22 NEW cov: 12405 ft: 13493 corp: 5/92b lim: 40 exec/s: 0 rss: 72Mb L: 20/31 MS: 1 ChangeBinInt- 00:08:26.374 [2024-11-27 12:03:55.237202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.237228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 [2024-11-27 12:03:55.237318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e51ae5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.374 [2024-11-27 12:03:55.237332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.633 #23 NEW cov: 12405 ft: 13606 corp: 6/113b lim: 40 exec/s: 0 rss: 72Mb L: 21/31 MS: 1 InsertByte- 00:08:26.633 [2024-11-27 12:03:55.297370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.633 [2024-11-27 12:03:55.297395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.633 [2024-11-27 12:03:55.297465] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.297479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.634 #24 NEW cov: 12405 ft: 13658 corp: 7/134b lim: 40 exec/s: 0 rss: 72Mb L: 21/31 MS: 1 CrossOver- 00:08:26.634 [2024-11-27 12:03:55.357861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.357886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.357941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.357955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.358005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.358035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.358089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:1ae5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.358101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.634 #25 NEW cov: 12405 ft: 14019 corp: 8/171b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 CrossOver- 00:08:26.634 [2024-11-27 12:03:55.397980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.398005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.398058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.398071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.398124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.398141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.398193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.398206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.634 #26 NEW cov: 12405 ft: 14040 
corp: 9/209b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:26.634 [2024-11-27 12:03:55.457708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:bbbbbbbb cdw11:bbbbbbbb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.457733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.634 #28 NEW cov: 12405 ft: 14825 corp: 10/224b lim: 40 exec/s: 0 rss: 72Mb L: 15/38 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:26.634 [2024-11-27 12:03:55.498258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5acacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.498284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.498336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.498349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.498401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acace5e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.498415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.634 [2024-11-27 12:03:55.498465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.634 [2024-11-27 12:03:55.498477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.893 #29 NEW cov: 12405 ft: 14894 corp: 11/258b lim: 40 exec/s: 0 rss: 72Mb L: 34/38 MS: 1 InsertRepeatedBytes- 00:08:26.893 [2024-11-27 12:03:55.558420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5acac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.558445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.558498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.558511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.558564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acace5e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.558578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.558632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.558645] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.893 #35 NEW cov: 12405 ft: 14918 corp: 12/292b lim: 40 exec/s: 0 rss: 72Mb L: 34/38 MS: 1 CopyPart- 00:08:26.893 [2024-11-27 12:03:55.618568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.618593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.618653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.618666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.618718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:e5000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.618754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.618813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.618826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.893 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:26.893 #36 NEW cov: 12428 ft: 14979 corp: 13/330b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:26.893 [2024-11-27 12:03:55.678438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.678462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.678520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.678534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.893 #37 NEW cov: 12428 ft: 14996 corp: 14/351b lim: 40 exec/s: 0 rss: 72Mb L: 21/38 MS: 1 ChangeBit- 00:08:26.893 [2024-11-27 12:03:55.718754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.718778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.718833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.718847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:26.893 [2024-11-27 12:03:55.718899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-27 12:03:55.718913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.893 #38 NEW cov: 12428 ft: 15026 corp: 15/382b lim: 40 exec/s: 38 rss: 73Mb L: 31/38 MS: 1 ChangeByte- 00:08:27.153 [2024-11-27 12:03:55.779091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5acacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.779116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.779171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacac00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.779187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.779242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:000000e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.779256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.779309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.779323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.153 #39 NEW cov: 12428 ft: 15047 corp: 16/416b lim: 40 exec/s: 39 rss: 73Mb L: 34/38 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:27.153 [2024-11-27 12:03:55.818864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.818888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.818942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.818955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 #40 NEW cov: 12428 ft: 15062 corp: 17/433b lim: 40 exec/s: 40 rss: 73Mb L: 17/38 MS: 1 EraseBytes- 00:08:27.153 [2024-11-27 12:03:55.858951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.858976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.859046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.859060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 #41 NEW cov: 12428 ft: 15082 corp: 18/452b lim: 40 exec/s: 41 rss: 73Mb L: 19/38 MS: 1 EraseBytes- 00:08:27.153 [2024-11-27 12:03:55.899354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5acac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.899379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.899434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:22acacac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.899447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.899502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acace5e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.899532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.899587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.899604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.153 #42 NEW cov: 12428 ft: 15105 corp: 19/486b lim: 40 exec/s: 42 rss: 73Mb L: 34/38 MS: 1 ChangeBinInt- 00:08:27.153 [2024-11-27 12:03:55.959421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.959447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.959499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.959514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:55.959570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00001000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:55.959584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.153 #43 NEW cov: 12428 ft: 15118 corp: 20/517b lim: 40 exec/s: 43 rss: 73Mb L: 31/38 MS: 1 ChangeBit- 00:08:27.153 [2024-11-27 12:03:56.019581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5ac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:56.019611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:56.019668] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acace5e5 cdw11:e5acacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:56.019682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.153 [2024-11-27 12:03:56.019735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:22acacac cdw11:e5e5e5ac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.153 [2024-11-27 12:03:56.019749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.412 #44 NEW cov: 12428 ft: 15142 corp: 21/545b lim: 40 exec/s: 44 rss: 73Mb L: 28/38 MS: 1 CrossOver- 00:08:27.412 [2024-11-27 12:03:56.080055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e549 cdw11:49494949 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.080081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.080136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:49e5e5ac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.080149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.080202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acacacac cdw11:ac000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.080215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.080269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00e5e5e5 cdw11:1ae5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.080281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.080334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:e5e5e5e5 cdw11:e51a0326 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.080347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:27.412 #45 NEW cov: 12428 ft: 15222 corp: 22/585b lim: 40 exec/s: 45 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:27.412 [2024-11-27 12:03:56.140245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.140269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.140322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00c7c7c7 cdw11:c7c7c7c7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.140336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 
12:03:56.140384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:c7c70000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.140398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.140449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.140462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.412 [2024-11-27 12:03:56.140515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:000000db SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.140528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:27.412 #46 NEW cov: 12428 ft: 15237 corp: 23/625b lim: 40 exec/s: 46 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:27.412 [2024-11-27 12:03:56.180161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-27 12:03:56.180186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.413 [2024-11-27 12:03:56.180237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.180251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.413 [2024-11-27 12:03:56.180300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.180331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.413 [2024-11-27 12:03:56.180386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.180398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.413 #47 NEW cov: 12428 ft: 15241 corp: 24/659b lim: 40 exec/s: 47 rss: 73Mb L: 34/40 MS: 1 CopyPart- 00:08:27.413 [2024-11-27 12:03:56.219978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.220003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.413 [2024-11-27 12:03:56.220054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e51ae5e5 cdw11:e500e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.220068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:27.413 #48 NEW cov: 12428 ft: 15260 corp: 25/680b lim: 40 exec/s: 48 rss: 73Mb L: 21/40 MS: 1 InsertByte- 00:08:27.413 [2024-11-27 12:03:56.259948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:bb0f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.413 [2024-11-27 12:03:56.259972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 #49 NEW cov: 12428 ft: 15322 corp: 26/695b lim: 40 exec/s: 49 rss: 73Mb L: 15/40 MS: 1 ChangeBinInt- 00:08:27.672 [2024-11-27 12:03:56.320624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5acac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.320649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.320704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:22acacac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.320718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.320771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acace5e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.320784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.320834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.320846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.672 #50 NEW cov: 12428 ft: 15342 corp: 27/732b lim: 40 exec/s: 50 rss: 73Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:08:27.672 [2024-11-27 12:03:56.380585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:bbbbbbbb cdw11:bbbbbbbb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.380615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.380671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.380685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.380737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e5e5e5e5 cdw11:bbbbbbe5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.380751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.672 #51 NEW cov: 12428 ft: 15352 corp: 28/763b lim: 40 exec/s: 51 rss: 73Mb L: 31/40 MS: 1 CrossOver- 00:08:27.672 [2024-11-27 12:03:56.420843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND 
(19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.420868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.420922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.420935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.420986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e52c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.420999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.421053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.421066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.672 #52 NEW cov: 12428 ft: 15386 corp: 29/801b lim: 40 exec/s: 52 rss: 73Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:27.672 [2024-11-27 12:03:56.460631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.460656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.460708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.460722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.672 #53 NEW cov: 12428 ft: 15413 corp: 30/822b lim: 40 exec/s: 53 rss: 73Mb L: 21/40 MS: 1 ChangeBit- 00:08:27.672 [2024-11-27 12:03:56.520803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5ac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.520827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.672 [2024-11-27 12:03:56.520880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acace5e5 cdw11:e5acacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-27 12:03:56.520893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.931 #54 NEW cov: 12428 ft: 15432 corp: 31/839b lim: 40 exec/s: 54 rss: 73Mb L: 17/40 MS: 1 EraseBytes- 00:08:27.931 [2024-11-27 12:03:56.581291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5acac cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.931 [2024-11-27 12:03:56.581316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:27.932 [2024-11-27 12:03:56.581372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:acacacac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.581385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.932 [2024-11-27 12:03:56.581437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:acace5e5 cdw11:e5e51ae5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.581450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.932 [2024-11-27 12:03:56.581501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e51a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.581514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.932 #55 NEW cov: 12428 ft: 15470 corp: 32/876b lim: 40 exec/s: 55 rss: 74Mb L: 37/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:27.932 [2024-11-27 12:03:56.640989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.641014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.932 #59 NEW cov: 12428 ft: 15479 corp: 33/887b lim: 40 exec/s: 59 rss: 74Mb L: 11/40 MS: 4 ChangeByte-CopyPart-InsertByte-CMP- DE: "\001\000\000\000\000\000\000?"- 00:08:27.932 [2024-11-27 12:03:56.681248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e529e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.681273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.932 [2024-11-27 12:03:56.681326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:acacace5 cdw11:e5e5acac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.681340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.932 #60 NEW cov: 12428 ft: 15485 corp: 34/905b lim: 40 exec/s: 60 rss: 74Mb L: 18/40 MS: 1 InsertByte- 00:08:27.932 [2024-11-27 12:03:56.741421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e5e5e5e5 cdw11:e5e5e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.741446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.932 [2024-11-27 12:03:56.741499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e51affff cdw11:0100e5e5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-27 12:03:56.741513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.932 #61 NEW cov: 12428 ft: 15528 corp: 35/926b lim: 40 exec/s: 30 rss: 74Mb L: 21/40 MS: 1 CMP- DE: "\377\377\001\000"- 
00:08:27.932 #61 DONE cov: 12428 ft: 15528 corp: 35/926b lim: 40 exec/s: 30 rss: 74Mb 00:08:27.932 ###### Recommended dictionary. ###### 00:08:27.932 "\000\000\000\000" # Uses: 2 00:08:27.932 "\001\000\000\000\000\000\000?" # Uses: 0 00:08:27.932 "\377\377\001\000" # Uses: 0 00:08:27.932 ###### End of recommended dictionary. ###### 00:08:27.932 Done 61 runs in 2 second(s) 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:28.191 12:03:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:28.191 [2024-11-27 12:03:56.926013] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:28.191 [2024-11-27 12:03:56.926093] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1728000 ] 00:08:28.451 [2024-11-27 12:03:57.105683] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.451 [2024-11-27 12:03:57.127517] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.451 [2024-11-27 12:03:57.179786] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.451 [2024-11-27 12:03:57.196153] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:28.451 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.451 INFO: Seed: 897618267 00:08:28.451 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:28.451 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:28.451 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:28.451 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.451 #2 INITED exec/s: 0 rss: 65Mb 00:08:28.451 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.451 This may also happen if the target rejected all inputs we tried so far 00:08:28.451 [2024-11-27 12:03:57.255631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.451 [2024-11-27 12:03:57.255658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.451 [2024-11-27 12:03:57.255720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.451 [2024-11-27 12:03:57.255734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.451 [2024-11-27 12:03:57.255791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.451 [2024-11-27 12:03:57.255804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.451 [2024-11-27 12:03:57.255862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.451 [2024-11-27 12:03:57.255875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.710 NEW_FUNC[1/714]: 0x46bbb8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:28.710 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.710 #7 NEW cov: 12189 ft: 12190 corp: 2/34b lim: 40 exec/s: 0 rss: 71Mb L: 33/33 MS: 5 InsertByte-CrossOver-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:28.710 [2024-11-27 12:03:57.566551] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.710 [2024-11-27 12:03:57.566592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.710 [2024-11-27 12:03:57.566667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.710 [2024-11-27 12:03:57.566686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.710 [2024-11-27 12:03:57.566761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.710 [2024-11-27 12:03:57.566780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.710 [2024-11-27 12:03:57.566854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.710 [2024-11-27 12:03:57.566872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.969 #13 NEW cov: 12302 ft: 12700 corp: 3/67b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeBit- 00:08:28.969 [2024-11-27 12:03:57.626563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.626591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.626656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.626670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.626728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.626742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.626801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.626814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.969 #14 NEW cov: 12308 ft: 12920 corp: 4/100b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:28.969 [2024-11-27 12:03:57.686579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.686612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:28.969 [2024-11-27 12:03:57.686688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.686702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.686760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.686773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.969 #15 NEW cov: 12393 ft: 13779 corp: 5/125b lim: 40 exec/s: 0 rss: 72Mb L: 25/33 MS: 1 EraseBytes- 00:08:28.969 [2024-11-27 12:03:57.726841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:808080ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.726867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.726927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff80 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.726941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.727004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.727017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.727075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.727088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.969 #16 NEW cov: 12393 ft: 13937 corp: 6/158b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:28.969 [2024-11-27 12:03:57.766783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.766809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.766868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.766882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.766938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.766952] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.969 #17 NEW cov: 12393 ft: 14015 corp: 7/184b lim: 40 exec/s: 0 rss: 72Mb L: 26/33 MS: 1 CopyPart- 00:08:28.969 [2024-11-27 12:03:57.827176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.827203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.827265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.827280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.827339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.827353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.969 [2024-11-27 12:03:57.827426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.969 [2024-11-27 12:03:57.827440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.969 #18 NEW cov: 12393 ft: 14102 corp: 8/217b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:29.229 [2024-11-27 12:03:57.867218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.867245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.867319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080802c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.867338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.867396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.867410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.867469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.867483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.229 #19 NEW cov: 12393 ft: 14211 corp: 9/250b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeByte- 00:08:29.229 [2024-11-27 12:03:57.907330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.907356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.907418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.907432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.907491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.907505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.907563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.907577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.229 #20 NEW cov: 12393 ft: 14264 corp: 10/283b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeBit- 00:08:29.229 [2024-11-27 12:03:57.967557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.967583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.967646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.967660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.967717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.967730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:57.967787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:57.967800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.229 #21 NEW cov: 12393 ft: 14368 corp: 11/320b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:29.229 [2024-11-27 12:03:58.027743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.027772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:29.229 [2024-11-27 12:03:58.027834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.027848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.027904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.027918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.027978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:8080e317 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.027991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.229 #22 NEW cov: 12393 ft: 14397 corp: 12/353b lim: 40 exec/s: 0 rss: 72Mb L: 33/37 MS: 1 ChangeByte- 00:08:29.229 [2024-11-27 12:03:58.067634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.067660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.067742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.067756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.067814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.067827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 #23 NEW cov: 12393 ft: 14431 corp: 13/378b lim: 40 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 EraseBytes- 00:08:29.229 [2024-11-27 12:03:58.107878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.107904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.107961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.107974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.108032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.108046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.229 [2024-11-27 12:03:58.108105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.229 [2024-11-27 12:03:58.108119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.488 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:29.488 #24 NEW cov: 12416 ft: 14487 corp: 14/411b lim: 40 exec/s: 0 rss: 72Mb L: 33/37 MS: 1 ShuffleBytes- 00:08:29.488 [2024-11-27 12:03:58.168090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.488 [2024-11-27 12:03:58.168117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.488 [2024-11-27 12:03:58.168178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.488 [2024-11-27 12:03:58.168192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.488 [2024-11-27 12:03:58.168251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80298080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.488 [2024-11-27 12:03:58.168265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.488 [2024-11-27 12:03:58.168325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.488 [2024-11-27 12:03:58.168338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.488 #25 NEW cov: 12416 ft: 14514 corp: 15/444b lim: 40 exec/s: 0 rss: 72Mb L: 33/37 MS: 1 ChangeByte- 00:08:29.489 [2024-11-27 12:03:58.208195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.208220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.208281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.208294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.208351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.208365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.208422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE 
(1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.208435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.489 #26 NEW cov: 12416 ft: 14580 corp: 16/481b lim: 40 exec/s: 26 rss: 72Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:29.489 [2024-11-27 12:03:58.267985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:32ffffff cdw11:ffffffae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.268010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.489 #30 NEW cov: 12416 ft: 14972 corp: 17/490b lim: 40 exec/s: 30 rss: 73Mb L: 9/37 MS: 4 ChangeByte-PersAutoDict-ChangeByte-PersAutoDict- DE: "\377\377\377\377"-"\377\377\377\377"- 00:08:29.489 [2024-11-27 12:03:58.308651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.308677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.308736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.308754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.308813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.308826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.308885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:808088ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.308899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.308957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffff8080 cdw11:8080173b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.308970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.489 #31 NEW cov: 12416 ft: 15015 corp: 18/530b lim: 40 exec/s: 31 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:29.489 [2024-11-27 12:03:58.368675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.368701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.368762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808680 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 
[2024-11-27 12:03:58.368776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.368835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.368848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.489 [2024-11-27 12:03:58.368907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.489 [2024-11-27 12:03:58.368920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.748 #32 NEW cov: 12416 ft: 15041 corp: 19/563b lim: 40 exec/s: 32 rss: 73Mb L: 33/40 MS: 1 ChangeBinInt- 00:08:29.748 [2024-11-27 12:03:58.408779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.408805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.408866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80ffffff cdw11:ff808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.408880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.408939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.408953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.409010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808880 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.409027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.748 #33 NEW cov: 12416 ft: 15063 corp: 20/600b lim: 40 exec/s: 33 rss: 73Mb L: 37/40 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:29.748 [2024-11-27 12:03:58.448852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.448878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.448938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.448952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.449011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 
cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.449025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.449080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.449094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.748 #34 NEW cov: 12416 ft: 15161 corp: 21/637b lim: 40 exec/s: 34 rss: 73Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:29.748 [2024-11-27 12:03:58.509101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:89808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.509127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.509186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.509199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.509258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.509272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.509329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.509342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.748 #35 NEW cov: 12416 ft: 15162 corp: 22/670b lim: 40 exec/s: 35 rss: 73Mb L: 33/40 MS: 1 ChangeBinInt- 00:08:29.748 [2024-11-27 12:03:58.549022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80800000 cdw11:00f68080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.549047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.549125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.549139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.748 [2024-11-27 12:03:58.549203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.748 [2024-11-27 12:03:58.549217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.748 #51 NEW cov: 12416 ft: 15229 corp: 23/696b lim: 40 
exec/s: 51 rss: 73Mb L: 26/40 MS: 1 CMP- DE: "\000\000\000\366"- 00:08:29.749 [2024-11-27 12:03:58.609328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.749 [2024-11-27 12:03:58.609353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.749 [2024-11-27 12:03:58.609414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.749 [2024-11-27 12:03:58.609428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.749 [2024-11-27 12:03:58.609484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.749 [2024-11-27 12:03:58.609497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.749 [2024-11-27 12:03:58.609556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.749 [2024-11-27 12:03:58.609569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.749 #52 NEW cov: 12416 ft: 15247 corp: 24/733b lim: 40 exec/s: 52 rss: 73Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:30.008 [2024-11-27 12:03:58.649434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.008 [2024-11-27 12:03:58.649459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.008 [2024-11-27 12:03:58.649519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.008 [2024-11-27 12:03:58.649532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.008 [2024-11-27 12:03:58.649593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.008 [2024-11-27 12:03:58.649612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.008 [2024-11-27 12:03:58.649692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.008 [2024-11-27 12:03:58.649706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.008 #53 NEW cov: 12416 ft: 15258 corp: 25/770b lim: 40 exec/s: 53 rss: 73Mb L: 37/40 MS: 1 CopyPart- 00:08:30.008 [2024-11-27 12:03:58.689551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 
12:03:58.689577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.689639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.689653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.689711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0a808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.689724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.689781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.689794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.009 #54 NEW cov: 12416 ft: 15287 corp: 26/803b lim: 40 exec/s: 54 rss: 73Mb L: 33/40 MS: 1 CrossOver- 00:08:30.009 [2024-11-27 12:03:58.729522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.729548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.729608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.729622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.729680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80298080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.729694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.009 #55 NEW cov: 12416 ft: 15312 corp: 27/828b lim: 40 exec/s: 55 rss: 73Mb L: 25/40 MS: 1 EraseBytes- 00:08:30.009 [2024-11-27 12:03:58.789843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.789869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.789932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.789946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.790003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.790016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.790072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80818880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.790085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.009 #56 NEW cov: 12416 ft: 15348 corp: 28/861b lim: 40 exec/s: 56 rss: 73Mb L: 33/40 MS: 1 ShuffleBytes- 00:08:30.009 [2024-11-27 12:03:58.829989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.830015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.830073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.830090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.830147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0a808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.830161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.830219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:8080800a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.830233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.009 #57 NEW cov: 12416 ft: 15353 corp: 29/894b lim: 40 exec/s: 57 rss: 73Mb L: 33/40 MS: 1 CopyPart- 00:08:30.009 [2024-11-27 12:03:58.890228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.890254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.890316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.890330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.890388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80298080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.890402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.009 [2024-11-27 12:03:58.890460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.009 [2024-11-27 12:03:58.890474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.266 #58 NEW cov: 12416 ft: 15363 corp: 30/927b lim: 40 exec/s: 58 rss: 73Mb L: 33/40 MS: 1 ChangeByte- 00:08:30.266 [2024-11-27 12:03:58.930078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.930103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.266 [2024-11-27 12:03:58.930166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.930179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.266 [2024-11-27 12:03:58.930239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f6808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.930253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.266 #59 NEW cov: 12416 ft: 15389 corp: 31/957b lim: 40 exec/s: 59 rss: 73Mb L: 30/40 MS: 1 PersAutoDict- DE: "\000\000\000\366"- 00:08:30.266 [2024-11-27 12:03:58.970197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.970224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.266 [2024-11-27 12:03:58.970285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.970302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.266 [2024-11-27 12:03:58.970366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80818880 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.266 [2024-11-27 12:03:58.970379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.266 #60 NEW cov: 12416 ft: 15434 corp: 32/982b lim: 40 exec/s: 60 rss: 73Mb L: 25/40 MS: 1 EraseBytes- 00:08:30.266 [2024-11-27 12:03:59.030540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.030568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.030631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080802c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.030645] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.030707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.030721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.030781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:24808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.030794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.267 #61 NEW cov: 12416 ft: 15454 corp: 33/1015b lim: 40 exec/s: 61 rss: 73Mb L: 33/40 MS: 1 ChangeByte- 00:08:30.267 [2024-11-27 12:03:59.090566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.090593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.090677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.090692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.090753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.090767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.267 #62 NEW cov: 12416 ft: 15465 corp: 34/1041b lim: 40 exec/s: 62 rss: 73Mb L: 26/40 MS: 1 InsertByte- 00:08:30.267 [2024-11-27 12:03:59.150868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.150896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.150957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.150971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.151029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80298080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.151046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.267 [2024-11-27 12:03:59.151106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:80808017 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:30.267 [2024-11-27 12:03:59.151119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.525 #63 NEW cov: 12416 ft: 15469 corp: 35/1075b lim: 40 exec/s: 63 rss: 73Mb L: 34/40 MS: 1 CopyPart- 00:08:30.525 [2024-11-27 12:03:59.211055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.211082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.211142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.211156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.211219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80818080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.211233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.211308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808880 cdw11:807e8017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.211321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.525 #64 NEW cov: 12416 ft: 15479 corp: 36/1108b lim: 40 exec/s: 64 rss: 74Mb L: 33/40 MS: 1 ChangeByte- 00:08:30.525 [2024-11-27 12:03:59.251136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.251162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.251239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:8080802c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.251253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.251313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.251327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.525 [2024-11-27 12:03:59.251384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:24808017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.525 [2024-11-27 12:03:59.251398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.525 #65 NEW cov: 12416 ft: 15512 corp: 37/1141b lim: 40 exec/s: 32 rss: 74Mb L: 33/40 MS: 1 CopyPart- 00:08:30.525 #65 DONE 
cov: 12416 ft: 15512 corp: 37/1141b lim: 40 exec/s: 32 rss: 74Mb 00:08:30.525 ###### Recommended dictionary. ###### 00:08:30.525 "\377\377\377\377" # Uses: 7 00:08:30.525 "\000\000\000\366" # Uses: 1 00:08:30.525 ###### End of recommended dictionary. ###### 00:08:30.525 Done 65 runs in 2 second(s) 00:08:30.525 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.525 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.525 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.526 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:30.526 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:30.526 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.784 12:03:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:30.784 [2024-11-27 12:03:59.455916] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:30.784 [2024-11-27 12:03:59.455988] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1728487 ] 00:08:30.784 [2024-11-27 12:03:59.629603] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.784 [2024-11-27 12:03:59.651405] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.043 [2024-11-27 12:03:59.704065] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.043 [2024-11-27 12:03:59.720433] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:31.043 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.043 INFO: Seed: 3420616160 00:08:31.043 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:31.043 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:31.043 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:31.043 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.043 #2 INITED exec/s: 0 rss: 65Mb 00:08:31.043 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.043 This may also happen if the target rejected all inputs we tried so far 00:08:31.043 [2024-11-27 12:03:59.765881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST IDENTIFIER cid:4 cdw10:80000081 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.043 [2024-11-27 12:03:59.765910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.301 NEW_FUNC[1/716]: 0x46d788 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:31.301 NEW_FUNC[2/716]: 0x493668 in feat_host_identifier /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:372 00:08:31.301 #11 NEW cov: 12190 ft: 12184 corp: 2/10b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 4 CopyPart-ChangeByte-ChangeByte-CMP- DE: "\377\221\\\305\271^f\362"- 00:08:31.301 [2024-11-27 12:04:00.108077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.301 [2024-11-27 12:04:00.108154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.301 [2024-11-27 12:04:00.108323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.301 [2024-11-27 12:04:00.108364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.301 #15 NEW cov: 12306 ft: 13580 corp: 3/29b lim: 35 exec/s: 0 rss: 72Mb L: 19/19 MS: 4 ChangeBit-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:31.301 [2024-11-27 12:04:00.167829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.301 [2024-11-27 12:04:00.167869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.301 
[2024-11-27 12:04:00.167993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.301 [2024-11-27 12:04:00.168013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 #16 NEW cov: 12312 ft: 13749 corp: 4/48b lim: 35 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ChangeByte- 00:08:31.560 [2024-11-27 12:04:00.238011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.238041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 [2024-11-27 12:04:00.238161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.238178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 #20 NEW cov: 12404 ft: 13989 corp: 5/66b lim: 35 exec/s: 0 rss: 72Mb L: 18/19 MS: 4 CrossOver-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:31.560 [2024-11-27 12:04:00.287820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST IDENTIFIER cid:4 cdw10:80000081 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.287857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 #21 NEW cov: 12404 ft: 14102 corp: 6/75b lim: 35 exec/s: 0 rss: 72Mb L: 9/19 MS: 1 ChangeBit- 00:08:31.560 [2024-11-27 12:04:00.358334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.358364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 [2024-11-27 12:04:00.358509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.358534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 #22 NEW cov: 12404 ft: 14325 corp: 7/94b lim: 35 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ShuffleBytes- 00:08:31.560 [2024-11-27 12:04:00.428300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST IDENTIFIER cid:4 cdw10:80000081 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.560 [2024-11-27 12:04:00.428336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 #23 NEW cov: 12404 ft: 14454 corp: 8/103b lim: 35 exec/s: 0 rss: 72Mb L: 9/19 MS: 1 ChangeByte- 00:08:31.819 [2024-11-27 12:04:00.479086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.479123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 [2024-11-27 12:04:00.479256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.479285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 [2024-11-27 12:04:00.479410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.479425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.819 #24 NEW cov: 12404 ft: 14723 corp: 9/130b lim: 35 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 PersAutoDict- DE: "\377\221\\\305\271^f\362"- 00:08:31.819 [2024-11-27 12:04:00.548618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.548650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 NEW_FUNC[1/1]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:31.819 #27 NEW cov: 12414 ft: 14773 corp: 10/140b lim: 35 exec/s: 0 rss: 72Mb L: 10/27 MS: 3 ShuffleBytes-CopyPart-CrossOver- 00:08:31.819 [2024-11-27 12:04:00.599108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.599140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 [2024-11-27 12:04:00.599262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.819 [2024-11-27 12:04:00.599288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 #28 NEW cov: 12414 ft: 14802 corp: 11/156b lim: 35 exec/s: 0 rss: 72Mb L: 16/27 MS: 1 EraseBytes- 00:08:31.819 [2024-11-27 12:04:00.669053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.820 [2024-11-27 12:04:00.669087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.820 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:31.820 #29 NEW cov: 12437 ft: 14830 corp: 12/164b lim: 35 exec/s: 0 rss: 72Mb L: 8/27 MS: 1 CrossOver- 00:08:32.079 [2024-11-27 12:04:00.718901] ctrlr.c:1783:nvmf_ctrlr_set_features_host_identifier: *ERROR*: Set Features - Host Identifier not allowed 00:08:32.079 [2024-11-27 12:04:00.719308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST IDENTIFIER cid:4 cdw10:00000081 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.719345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: COMMAND SEQUENCE ERROR (00/0c) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.079 NEW_FUNC[1/1]: 0x1362558 in nvmf_ctrlr_set_features_host_identifier /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1780 00:08:32.079 #30 NEW cov: 12446 ft: 14885 corp: 13/173b lim: 35 exec/s: 30 rss: 73Mb L: 9/27 MS: 1 ChangeByte- 00:08:32.079 
[2024-11-27 12:04:00.800570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.800609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.800734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.800751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.800884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.800902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.801028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.801053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.801182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000e4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.801205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:32.079 #31 NEW cov: 12446 ft: 15294 corp: 14/208b lim: 35 exec/s: 31 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:32.079 [2024-11-27 12:04:00.869921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.869955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.870089] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.870110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.079 #32 NEW cov: 12446 ft: 15306 corp: 15/227b lim: 35 exec/s: 32 rss: 73Mb L: 19/35 MS: 1 CrossOver- 00:08:32.079 [2024-11-27 12:04:00.920405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.920437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.920577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.079 [2024-11-27 12:04:00.920596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.079 [2024-11-27 12:04:00.920740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:32.079 [2024-11-27 12:04:00.920758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.079 #33 NEW cov: 12446 ft: 15314 corp: 16/249b lim: 35 exec/s: 33 rss: 73Mb L: 22/35 MS: 1 InsertRepeatedBytes- 00:08:32.338 [2024-11-27 12:04:00.970033] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST IDENTIFIER cid:4 cdw10:80000081 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.338 [2024-11-27 12:04:00.970070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.338 #34 NEW cov: 12446 ft: 15329 corp: 17/258b lim: 35 exec/s: 34 rss: 73Mb L: 9/35 MS: 1 CMP- DE: "\001\000"- 00:08:32.338 [2024-11-27 12:04:01.020363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.338 [2024-11-27 12:04:01.020393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.338 [2024-11-27 12:04:01.020542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.338 [2024-11-27 12:04:01.020562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.338 #35 NEW cov: 12446 ft: 15350 corp: 18/274b lim: 35 exec/s: 35 rss: 73Mb L: 16/35 MS: 1 ChangeBit- 00:08:32.338 [2024-11-27 12:04:01.090907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.338 [2024-11-27 12:04:01.090944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.338 [2024-11-27 12:04:01.091084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.338 [2024-11-27 12:04:01.091108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.339 [2024-11-27 12:04:01.091236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.339 [2024-11-27 12:04:01.091253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.339 #36 NEW cov: 12446 ft: 15387 corp: 19/301b lim: 35 exec/s: 36 rss: 73Mb L: 27/35 MS: 1 ShuffleBytes- 00:08:32.339 [2024-11-27 12:04:01.140752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.339 [2024-11-27 12:04:01.140783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.339 [2024-11-27 12:04:01.140909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.339 [2024-11-27 12:04:01.140927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.339 #37 NEW cov: 12446 ft: 15416 corp: 20/317b lim: 35 exec/s: 37 rss: 73Mb L: 16/35 MS: 1 ShuffleBytes- 
00:08:32.339 [2024-11-27 12:04:01.210963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.339 [2024-11-27 12:04:01.210993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.339 [2024-11-27 12:04:01.211136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.339 [2024-11-27 12:04:01.211162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.598 #38 NEW cov: 12446 ft: 15443 corp: 21/333b lim: 35 exec/s: 38 rss: 73Mb L: 16/35 MS: 1 CopyPart- 00:08:32.598 [2024-11-27 12:04:01.261383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.261420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.261558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000073 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.261575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.261716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.261739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.598 #39 NEW cov: 12446 ft: 15464 corp: 22/359b lim: 35 exec/s: 39 rss: 73Mb L: 26/35 MS: 1 InsertRepeatedBytes- 00:08:32.598 [2024-11-27 12:04:01.311422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.311456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.311589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.311619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.361450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.361480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.361620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.361639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.598 #41 NEW cov: 12446 ft: 15474 corp: 23/379b lim: 35 exec/s: 41 rss: 73Mb L: 20/35 MS: 2 InsertByte-ShuffleBytes- 00:08:32.598 [2024-11-27 
12:04:01.411396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.411428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.598 #42 NEW cov: 12446 ft: 15506 corp: 24/390b lim: 35 exec/s: 42 rss: 73Mb L: 11/35 MS: 1 InsertByte- 00:08:32.598 [2024-11-27 12:04:01.481899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.481930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.598 [2024-11-27 12:04:01.482074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.598 [2024-11-27 12:04:01.482095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.858 #43 NEW cov: 12446 ft: 15566 corp: 25/409b lim: 35 exec/s: 43 rss: 73Mb L: 19/35 MS: 1 ChangeBit- 00:08:32.858 [2024-11-27 12:04:01.531978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.532008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.532133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.532149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.602158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.602194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.602336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.602353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.858 #45 NEW cov: 12446 ft: 15577 corp: 26/425b lim: 35 exec/s: 45 rss: 73Mb L: 16/35 MS: 2 ChangeBinInt-ShuffleBytes- 00:08:32.858 [2024-11-27 12:04:01.652687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.652720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.652872] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.652896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.653023] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.653049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.858 #46 NEW cov: 12446 ft: 15606 corp: 27/449b lim: 35 exec/s: 46 rss: 73Mb L: 24/35 MS: 1 CopyPart- 00:08:32.858 [2024-11-27 12:04:01.702584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000091 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.702618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.858 [2024-11-27 12:04:01.702749] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.858 [2024-11-27 12:04:01.702775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.858 #47 NEW cov: 12446 ft: 15621 corp: 28/466b lim: 35 exec/s: 47 rss: 73Mb L: 17/35 MS: 1 InsertByte- 00:08:33.117 [2024-11-27 12:04:01.742984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.117 [2024-11-27 12:04:01.743021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.117 [2024-11-27 12:04:01.743145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000073 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.117 [2024-11-27 12:04:01.743163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.117 [2024-11-27 12:04:01.743296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.117 [2024-11-27 12:04:01.743315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.117 #48 NEW cov: 12446 ft: 15625 corp: 29/492b lim: 35 exec/s: 24 rss: 73Mb L: 26/35 MS: 1 ChangeBit- 00:08:33.117 #48 DONE cov: 12446 ft: 15625 corp: 29/492b lim: 35 exec/s: 24 rss: 73Mb 00:08:33.117 ###### Recommended dictionary. ###### 00:08:33.117 "\377\221\\\305\271^f\362" # Uses: 1 00:08:33.117 "\001\000" # Uses: 0 00:08:33.117 ###### End of recommended dictionary. 
###### 00:08:33.117 Done 48 runs in 2 second(s) 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:33.117 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:33.118 12:04:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:33.118 [2024-11-27 12:04:01.945818] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:33.118 [2024-11-27 12:04:01.945884] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1728822 ] 00:08:33.376 [2024-11-27 12:04:02.121934] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.376 [2024-11-27 12:04:02.143981] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.376 [2024-11-27 12:04:02.196263] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.376 [2024-11-27 12:04:02.212631] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:33.376 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.376 INFO: Seed: 1618652693 00:08:33.376 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:33.376 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:33.376 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:33.376 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.376 #2 INITED exec/s: 0 rss: 65Mb 00:08:33.376 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.376 This may also happen if the target rejected all inputs we tried so far 00:08:33.894 NEW_FUNC[1/701]: 0x46ecc8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:33.894 NEW_FUNC[2/701]: 0x48b638 in feat_error_recover /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:304 00:08:33.894 #22 NEW cov: 12067 ft: 12066 corp: 2/8b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 5 ChangeBinInt-ChangeBit-InsertRepeatedBytes-InsertByte-CMP- DE: "\000\\"- 00:08:33.894 [2024-11-27 12:04:02.619756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.894 [2024-11-27 12:04:02.619800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.894 NEW_FUNC[1/14]: 0x19256f8 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:33.894 NEW_FUNC[2/14]: 0x1925938 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:33.894 #26 NEW cov: 12311 ft: 12961 corp: 3/15b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 4 ChangeByte-ChangeBit-CrossOver-InsertByte- 00:08:33.894 #32 NEW cov: 12317 ft: 13150 corp: 4/27b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:08:33.894 #33 NEW cov: 12402 ft: 13375 corp: 5/34b lim: 35 exec/s: 0 rss: 72Mb L: 7/12 MS: 1 EraseBytes- 00:08:34.151 [2024-11-27 12:04:02.810336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.151 [2024-11-27 12:04:02.810369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.151 #34 NEW cov: 12402 ft: 13465 corp: 6/41b lim: 35 exec/s: 0 rss: 72Mb L: 7/12 MS: 1 ChangeBinInt- 00:08:34.151 [2024-11-27 12:04:02.880726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.151 [2024-11-27 12:04:02.880757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.151 [2024-11-27 12:04:02.880891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.151 [2024-11-27 12:04:02.880910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.151 #35 NEW cov: 12402 ft: 13839 corp: 7/56b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:08:34.151 [2024-11-27 12:04:02.930902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.151 [2024-11-27 12:04:02.930931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.151 [2024-11-27 12:04:02.931053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.151 [2024-11-27 12:04:02.931071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.151 #36 NEW cov: 12402 ft: 14096 corp: 8/71b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 CMP- DE: "\246<\013D\357\177\000\000"- 00:08:34.151 [2024-11-27 12:04:03.000826] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.152 [2024-11-27 12:04:03.000855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.409 #37 NEW cov: 12402 ft: 14148 corp: 9/78b lim: 35 exec/s: 0 rss: 73Mb L: 7/15 MS: 1 CrossOver- 00:08:34.409 [2024-11-27 12:04:03.071528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.409 [2024-11-27 12:04:03.071557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.409 [2024-11-27 12:04:03.071692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.409 [2024-11-27 12:04:03.071709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.409 [2024-11-27 12:04:03.071848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.409 [2024-11-27 12:04:03.071866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.409 #38 NEW cov: 12402 ft: 14418 corp: 10/105b lim: 35 exec/s: 0 rss: 73Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:34.409 [2024-11-27 12:04:03.141422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.409 [2024-11-27 12:04:03.141451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.409 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:34.409 #39 NEW cov: 12425 ft: 14526 corp: 11/112b lim: 35 exec/s: 0 rss: 73Mb L: 7/27 MS: 1 CrossOver- 00:08:34.409 #40 NEW cov: 12425 ft: 14627 corp: 12/125b lim: 35 exec/s: 0 rss: 73Mb L: 13/27 MS: 1 InsertByte- 00:08:34.409 [2024-11-27 12:04:03.241647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.409 [2024-11-27 12:04:03.241678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.409 #42 NEW cov: 12425 ft: 14704 corp: 13/136b lim: 35 exec/s: 42 rss: 73Mb L: 11/27 MS: 2 CrossOver-CrossOver- 00:08:34.667 [2024-11-27 12:04:03.311847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.667 [2024-11-27 12:04:03.311875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.667 #43 NEW cov: 12425 ft: 14752 corp: 14/149b lim: 35 exec/s: 43 rss: 73Mb L: 13/27 MS: 1 CrossOver- 00:08:34.667 [2024-11-27 12:04:03.382063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.667 [2024-11-27 12:04:03.382091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.667 #44 NEW cov: 12425 ft: 14762 corp: 15/156b lim: 35 exec/s: 44 rss: 73Mb L: 7/27 MS: 1 CMP- DE: "F\000\000\000"- 00:08:34.667 [2024-11-27 12:04:03.432243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.667 [2024-11-27 12:04:03.432274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.667 #45 NEW cov: 12425 ft: 14803 corp: 16/167b lim: 35 exec/s: 45 rss: 73Mb L: 11/27 MS: 1 PersAutoDict- DE: "F\000\000\000"- 00:08:34.667 [2024-11-27 12:04:03.502445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.667 [2024-11-27 12:04:03.502477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.667 #46 NEW cov: 12425 ft: 14851 corp: 17/178b lim: 35 exec/s: 46 rss: 73Mb L: 11/27 MS: 1 ChangeBinInt- 00:08:34.925 #47 NEW cov: 12425 ft: 14901 corp: 18/191b lim: 35 exec/s: 47 rss: 73Mb L: 13/27 MS: 1 ShuffleBytes- 00:08:34.925 [2024-11-27 12:04:03.643160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.643191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.925 [2024-11-27 12:04:03.643322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.643340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.925 #48 NEW cov: 12425 ft: 14955 corp: 19/206b lim: 35 exec/s: 48 rss: 73Mb L: 15/27 MS: 1 
ChangeByte- 00:08:34.925 [2024-11-27 12:04:03.713522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.713553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.925 [2024-11-27 12:04:03.713697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.713715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.925 [2024-11-27 12:04:03.713853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.713871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.925 #51 NEW cov: 12425 ft: 14971 corp: 20/232b lim: 35 exec/s: 51 rss: 73Mb L: 26/27 MS: 3 EraseBytes-EraseBytes-InsertRepeatedBytes- 00:08:34.925 [2024-11-27 12:04:03.763257] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000025c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.925 [2024-11-27 12:04:03.763288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.925 #54 NEW cov: 12425 ft: 15022 corp: 21/245b lim: 35 exec/s: 54 rss: 73Mb L: 13/27 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:35.185 [2024-11-27 12:04:03.813867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.813898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:03.814016] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.814034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:03.814169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.814186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.185 #55 NEW cov: 12425 ft: 15036 corp: 22/271b lim: 35 exec/s: 55 rss: 73Mb L: 26/27 MS: 1 ChangeBit- 00:08:35.185 [2024-11-27 12:04:03.884393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.884422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:03.884561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.884580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 
12:04:03.884727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.884745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:03.884876] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.884893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.185 #56 NEW cov: 12425 ft: 15514 corp: 23/303b lim: 35 exec/s: 56 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:35.185 [2024-11-27 12:04:03.933977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.934007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:03.934145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000b3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:03.934163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.185 #57 NEW cov: 12425 ft: 15527 corp: 24/317b lim: 35 exec/s: 57 rss: 74Mb L: 14/32 MS: 1 InsertByte- 00:08:35.185 [2024-11-27 12:04:04.004649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:04.004678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:04.004821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:04.004843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:04.004989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:04.005008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.185 [2024-11-27 12:04:04.005147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000252 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.185 [2024-11-27 12:04:04.005167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.185 #58 NEW cov: 12425 ft: 15545 corp: 25/349b lim: 35 exec/s: 58 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:35.444 [2024-11-27 12:04:04.074281] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.444 [2024-11-27 12:04:04.074310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.444 #59 NEW cov: 12425 ft: 15566 corp: 26/357b lim: 35 exec/s: 59 rss: 74Mb L: 8/32 MS: 1 InsertByte- 
00:08:35.444 [2024-11-27 12:04:04.144363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.444 [2024-11-27 12:04:04.144391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.444 #60 NEW cov: 12425 ft: 15585 corp: 27/368b lim: 35 exec/s: 60 rss: 74Mb L: 11/32 MS: 1 ChangeByte- 00:08:35.444 [2024-11-27 12:04:04.215059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000224 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.444 [2024-11-27 12:04:04.215087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.444 [2024-11-27 12:04:04.215224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.444 [2024-11-27 12:04:04.215242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.444 [2024-11-27 12:04:04.215376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.444 [2024-11-27 12:04:04.215394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.444 #61 NEW cov: 12425 ft: 15622 corp: 28/394b lim: 35 exec/s: 30 rss: 74Mb L: 26/32 MS: 1 ChangeByte- 00:08:35.444 #61 DONE cov: 12425 ft: 15622 corp: 28/394b lim: 35 exec/s: 30 rss: 74Mb 00:08:35.444 ###### Recommended dictionary. ###### 00:08:35.444 "\000\\" # Uses: 1 00:08:35.444 "\246<\013D\357\177\000\000" # Uses: 0 00:08:35.444 "F\000\000\000" # Uses: 1 00:08:35.444 ###### End of recommended dictionary. 
###### 00:08:35.444 Done 61 runs in 2 second(s) 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.703 12:04:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:35.703 [2024-11-27 12:04:04.418835] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:35.703 [2024-11-27 12:04:04.418902] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1729353 ] 00:08:35.962 [2024-11-27 12:04:04.595174] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.962 [2024-11-27 12:04:04.618126] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.962 [2024-11-27 12:04:04.670751] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.962 [2024-11-27 12:04:04.687065] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:35.962 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.962 INFO: Seed: 4092654366 00:08:35.962 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:35.962 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:35.962 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:35.962 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.962 #2 INITED exec/s: 0 rss: 65Mb 00:08:35.962 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.962 This may also happen if the target rejected all inputs we tried so far 00:08:35.962 [2024-11-27 12:04:04.732684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.962 [2024-11-27 12:04:04.732714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.962 [2024-11-27 12:04:04.732755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.962 [2024-11-27 12:04:04.732770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.962 [2024-11-27 12:04:04.732820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.962 [2024-11-27 12:04:04.732834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.962 [2024-11-27 12:04:04.732886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.962 [2024-11-27 12:04:04.732901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.221 NEW_FUNC[1/715]: 0x470188 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:36.221 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.221 #5 NEW cov: 12275 ft: 12274 corp: 2/95b lim: 105 exec/s: 0 rss: 72Mb L: 94/94 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:36.221 [2024-11-27 12:04:05.063117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.221 [2024-11-27 12:04:05.063151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.221 #12 NEW cov: 12388 ft: 13463 corp: 3/130b lim: 105 exec/s: 0 rss: 72Mb L: 35/94 MS: 2 ShuffleBytes-CrossOver- 00:08:36.221 [2024-11-27 12:04:05.103309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.221 [2024-11-27 12:04:05.103337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.221 [2024-11-27 12:04:05.103375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.221 [2024-11-27 12:04:05.103392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.480 #16 NEW cov: 12394 ft: 14053 corp: 4/188b lim: 105 exec/s: 0 rss: 72Mb L: 58/94 MS: 4 ChangeBit-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:36.480 [2024-11-27 12:04:05.143417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.143446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.480 [2024-11-27 12:04:05.143485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.143500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.480 #17 NEW cov: 12479 ft: 14255 corp: 5/246b lim: 105 exec/s: 0 rss: 72Mb L: 58/94 MS: 1 ChangeBinInt- 00:08:36.480 [2024-11-27 12:04:05.203704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.203732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.480 [2024-11-27 12:04:05.203780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6293595036912670551 len:22277 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.203795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.480 [2024-11-27 12:04:05.203847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:289360691352306692 len:1029 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.203862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.480 #18 NEW cov: 12479 ft: 14608 corp: 6/316b lim: 105 exec/s: 0 rss: 72Mb L: 70/94 MS: 1 InsertRepeatedBytes- 00:08:36.480 [2024-11-27 12:04:05.263785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.263813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.480 [2024-11-27 12:04:05.263871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.263886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.480 #19 NEW cov: 12479 ft: 14689 corp: 7/374b lim: 105 exec/s: 0 rss: 72Mb L: 58/94 MS: 1 CopyPart- 00:08:36.480 [2024-11-27 12:04:05.323892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.323920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.480 [2024-11-27 12:04:05.323960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.480 [2024-11-27 12:04:05.323975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 #20 NEW cov: 12479 ft: 14746 corp: 8/433b lim: 105 exec/s: 0 rss: 72Mb L: 59/94 MS: 1 InsertByte- 00:08:36.739 [2024-11-27 12:04:05.384088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.384115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.384154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.384169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 #21 NEW cov: 12479 ft: 14804 corp: 9/495b lim: 105 exec/s: 0 rss: 72Mb L: 62/94 MS: 1 CopyPart- 00:08:36.739 [2024-11-27 12:04:05.424203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.424231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.424269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.424285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 #22 NEW cov: 12479 ft: 14848 corp: 10/554b lim: 105 exec/s: 0 rss: 73Mb L: 59/94 MS: 1 ShuffleBytes- 00:08:36.739 [2024-11-27 12:04:05.484370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.484397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.484435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.484451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 #23 NEW cov: 12479 ft: 14924 corp: 11/612b lim: 105 exec/s: 0 rss: 73Mb L: 58/94 MS: 1 ChangeBit- 00:08:36.739 [2024-11-27 12:04:05.524454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.524480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.524535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.524557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 #24 NEW cov: 12479 ft: 14955 corp: 12/672b lim: 105 exec/s: 0 rss: 73Mb L: 60/94 MS: 1 CrossOver- 00:08:36.739 [2024-11-27 12:04:05.564882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.564910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.564977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.564993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.565049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.565063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.739 [2024-11-27 12:04:05.565120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069431361535 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.739 [2024-11-27 12:04:05.565136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.739 #25 NEW cov: 12479 ft: 14976 corp: 13/772b lim: 105 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 CopyPart- 00:08:36.998 [2024-11-27 12:04:05.624820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.624849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.624892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069582356479 len:257 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.624907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:36.999 #26 NEW cov: 12502 ft: 15083 corp: 14/832b lim: 105 exec/s: 0 rss: 73Mb L: 60/100 MS: 1 ChangeBinInt- 00:08:36.999 [2024-11-27 12:04:05.664917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.664944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.664988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073698934783 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.665002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 #27 NEW cov: 12502 ft: 15110 corp: 15/891b lim: 105 exec/s: 0 rss: 73Mb L: 59/100 MS: 1 InsertByte- 00:08:36.999 [2024-11-27 12:04:05.705075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551576 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.705101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.705144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.705163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.705217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.705233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.999 #28 NEW cov: 12502 ft: 15173 corp: 16/954b lim: 105 exec/s: 28 rss: 73Mb L: 63/100 MS: 1 InsertByte- 00:08:36.999 [2024-11-27 12:04:05.765208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.765238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.765287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.765304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 #29 NEW cov: 12502 ft: 15194 corp: 17/1000b lim: 105 exec/s: 29 rss: 73Mb L: 46/100 MS: 1 EraseBytes- 00:08:36.999 [2024-11-27 12:04:05.805269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.805297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.805336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.805351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 #30 NEW cov: 12502 ft: 15221 corp: 18/1058b lim: 105 exec/s: 30 rss: 73Mb L: 58/100 MS: 1 ChangeBinInt- 00:08:36.999 [2024-11-27 12:04:05.845422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.845449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.999 [2024-11-27 12:04:05.845488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.999 [2024-11-27 12:04:05.845504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.999 #31 NEW cov: 12502 ft: 15260 corp: 19/1118b lim: 105 exec/s: 31 rss: 73Mb L: 60/100 MS: 1 ChangeByte- 00:08:37.258 [2024-11-27 12:04:05.885792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.885820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:05.885878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069582356479 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.885894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:05.885948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073692842495 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.885965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:05.886027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.886043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.258 #32 NEW cov: 12502 ft: 15278 corp: 20/1212b lim: 105 exec/s: 32 rss: 73Mb L: 94/100 MS: 1 CopyPart- 00:08:37.258 [2024-11-27 12:04:05.945702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.945730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:05.945769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:05.945785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.258 #33 NEW cov: 12502 ft: 15287 corp: 21/1270b lim: 105 exec/s: 33 rss: 73Mb L: 58/100 MS: 1 ShuffleBytes- 00:08:37.258 [2024-11-27 12:04:06.006088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.006116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.006171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.006186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.006241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6299787486400304983 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.006256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.006309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.006325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.258 #34 NEW cov: 12502 ft: 15292 corp: 22/1365b lim: 105 exec/s: 34 rss: 73Mb L: 95/100 MS: 1 InsertByte- 00:08:37.258 [2024-11-27 12:04:06.066220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.066246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.066293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071873663743 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.066309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.066361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.066375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.258 #35 NEW cov: 12502 ft: 15368 corp: 23/1438b lim: 105 exec/s: 35 rss: 73Mb L: 73/100 MS: 1 InsertRepeatedBytes- 00:08:37.258 [2024-11-27 12:04:06.106216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.106247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.106281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.106297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.258 [2024-11-27 12:04:06.106351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.258 [2024-11-27 12:04:06.106366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.258 #36 NEW cov: 12502 ft: 15380 corp: 24/1508b lim: 105 exec/s: 36 rss: 73Mb L: 70/100 MS: 1 CrossOver- 00:08:37.517 [2024-11-27 12:04:06.146363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.146390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.146428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073698934783 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.146443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.146499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.146515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.517 #37 NEW cov: 12502 ft: 15403 corp: 25/1571b lim: 105 exec/s: 37 rss: 73Mb L: 63/100 MS: 1 CMP- DE: "\001\000\000\006"- 00:08:37.517 [2024-11-27 12:04:06.206409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18401989552412557311 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.206438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.206487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.206503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.517 #38 NEW cov: 12502 ft: 15435 corp: 26/1617b lim: 105 exec/s: 38 rss: 73Mb L: 46/100 MS: 1 ChangeByte- 00:08:37.517 [2024-11-27 12:04:06.266610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.266637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:37.517 [2024-11-27 12:04:06.266674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.266689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.517 #39 NEW cov: 12502 ft: 15453 corp: 27/1676b lim: 105 exec/s: 39 rss: 74Mb L: 59/100 MS: 1 ShuffleBytes- 00:08:37.517 [2024-11-27 12:04:06.326896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13835058055282163672 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.326922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.326964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.326979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.327033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.327049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.517 #40 NEW cov: 12502 ft: 15466 corp: 28/1739b lim: 105 exec/s: 40 rss: 74Mb L: 63/100 MS: 1 ChangeBit- 00:08:37.517 [2024-11-27 12:04:06.386903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374967958943301631 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.386930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.517 [2024-11-27 12:04:06.386970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.517 [2024-11-27 12:04:06.386985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.776 #41 NEW cov: 12502 ft: 15474 corp: 29/1801b lim: 105 exec/s: 41 rss: 74Mb L: 62/100 MS: 1 PersAutoDict- DE: "\001\000\000\006"- 00:08:37.776 [2024-11-27 12:04:06.427396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.776 [2024-11-27 12:04:06.427423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.776 [2024-11-27 12:04:06.427480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65282 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.776 [2024-11-27 12:04:06.427496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.776 [2024-11-27 12:04:06.427548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.427564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.427635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069431361535 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.427651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.427707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:10304235945520434830 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.427722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:37.777 #42 NEW cov: 12502 ft: 15526 corp: 30/1906b lim: 105 exec/s: 42 rss: 74Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:08:37.777 [2024-11-27 12:04:06.487185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.487212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.487253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.487269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.777 #43 NEW cov: 12502 ft: 15535 corp: 31/1959b lim: 105 exec/s: 43 rss: 74Mb L: 53/105 MS: 1 EraseBytes- 00:08:37.777 [2024-11-27 12:04:06.527559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18230570200674074623 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.527587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.527649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069582356479 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.527665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.527719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073692842495 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.527734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.527789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.527804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.777 #44 NEW cov: 12502 ft: 15559 corp: 32/2053b lim: 105 exec/s: 44 rss: 74Mb L: 94/105 MS: 1 ChangeByte- 00:08:37.777 
[2024-11-27 12:04:06.587505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.587531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.587570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.587584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.777 #45 NEW cov: 12502 ft: 15573 corp: 33/2112b lim: 105 exec/s: 45 rss: 74Mb L: 59/105 MS: 1 ChangeBinInt- 00:08:37.777 [2024-11-27 12:04:06.647776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.647803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.647849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.647864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.777 [2024-11-27 12:04:06.647918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.777 [2024-11-27 12:04:06.647933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.036 #46 NEW cov: 12502 ft: 15579 corp: 34/2178b lim: 105 exec/s: 46 rss: 74Mb L: 66/105 MS: 1 CrossOver- 00:08:38.036 [2024-11-27 12:04:06.707887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742982787858431 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.036 [2024-11-27 12:04:06.707916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.036 [2024-11-27 12:04:06.707968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069582356479 len:257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.036 [2024-11-27 12:04:06.707987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.036 #47 NEW cov: 12502 ft: 15597 corp: 35/2232b lim: 105 exec/s: 23 rss: 74Mb L: 54/105 MS: 1 EraseBytes- 00:08:38.036 #47 DONE cov: 12502 ft: 15597 corp: 35/2232b lim: 105 exec/s: 23 rss: 74Mb 00:08:38.036 ###### Recommended dictionary. ###### 00:08:38.036 "\001\000\000\006" # Uses: 1 00:08:38.036 ###### End of recommended dictionary. 
###### 00:08:38.036 Done 47 runs in 2 second(s) 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:38.036 12:04:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:38.036 [2024-11-27 12:04:06.895913] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:38.036 [2024-11-27 12:04:06.895986] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1729735 ] 00:08:38.295 [2024-11-27 12:04:07.077796] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.295 [2024-11-27 12:04:07.100885] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.295 [2024-11-27 12:04:07.153412] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.295 [2024-11-27 12:04:07.169786] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:38.554 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.554 INFO: Seed: 2278689964 00:08:38.554 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:38.554 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:38.554 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:38.554 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.554 #2 INITED exec/s: 0 rss: 66Mb 00:08:38.554 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.554 This may also happen if the target rejected all inputs we tried so far 00:08:38.554 [2024-11-27 12:04:07.217208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.554 [2024-11-27 12:04:07.217237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.554 [2024-11-27 12:04:07.217297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.554 [2024-11-27 12:04:07.217313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.813 NEW_FUNC[1/716]: 0x473508 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:38.813 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.813 #7 NEW cov: 12289 ft: 12287 corp: 2/59b lim: 120 exec/s: 0 rss: 72Mb L: 58/58 MS: 5 ShuffleBytes-InsertByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:38.813 [2024-11-27 12:04:07.559174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.559233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.813 [2024-11-27 12:04:07.559372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18437736874454810623 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.559406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.813 #13 NEW cov: 12409 ft: 12998 corp: 3/117b lim: 120 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 ChangeBit- 
00:08:38.813 [2024-11-27 12:04:07.629348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.629377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.813 [2024-11-27 12:04:07.629496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.629521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.813 #14 NEW cov: 12415 ft: 13167 corp: 4/175b lim: 120 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 CopyPart- 00:08:38.813 [2024-11-27 12:04:07.679280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.679313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.813 [2024-11-27 12:04:07.679422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.813 [2024-11-27 12:04:07.679443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.072 #15 NEW cov: 12500 ft: 13463 corp: 5/233b lim: 120 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 ShuffleBytes- 00:08:39.072 [2024-11-27 12:04:07.750211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.750239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.750350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.750378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.750490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.750516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.750635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.750665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.072 #16 NEW cov: 12500 ft: 14042 corp: 6/352b lim: 120 exec/s: 0 rss: 72Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:08:39.072 [2024-11-27 12:04:07.799746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 
[2024-11-27 12:04:07.799782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.799896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.799920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.072 #17 NEW cov: 12500 ft: 14269 corp: 7/410b lim: 120 exec/s: 0 rss: 72Mb L: 58/119 MS: 1 CopyPart- 00:08:39.072 [2024-11-27 12:04:07.849895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.849930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.850055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.850079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.072 #18 NEW cov: 12500 ft: 14377 corp: 8/468b lim: 120 exec/s: 0 rss: 72Mb L: 58/119 MS: 1 ShuffleBytes- 00:08:39.072 [2024-11-27 12:04:07.900607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.900638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.900721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.900744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.900858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.900883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.901002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.901025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.072 #19 NEW cov: 12500 ft: 14439 corp: 9/565b lim: 120 exec/s: 0 rss: 72Mb L: 97/119 MS: 1 InsertRepeatedBytes- 00:08:39.072 [2024-11-27 12:04:07.950157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.950194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.072 [2024-11-27 12:04:07.950315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.072 [2024-11-27 12:04:07.950334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.331 #20 NEW cov: 12500 ft: 14524 corp: 10/623b lim: 120 exec/s: 0 rss: 72Mb L: 58/119 MS: 1 ChangeByte- 00:08:39.331 [2024-11-27 12:04:08.020398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.331 [2024-11-27 12:04:08.020432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.331 [2024-11-27 12:04:08.020560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.331 [2024-11-27 12:04:08.020581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.331 #21 NEW cov: 12500 ft: 14571 corp: 11/681b lim: 120 exec/s: 0 rss: 72Mb L: 58/119 MS: 1 ChangeByte- 00:08:39.331 [2024-11-27 12:04:08.090377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:799402123888954367 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.331 [2024-11-27 12:04:08.090413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.331 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:39.331 #22 NEW cov: 12523 ft: 15439 corp: 12/711b lim: 120 exec/s: 0 rss: 72Mb L: 30/119 MS: 1 CrossOver- 00:08:39.331 [2024-11-27 12:04:08.160572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.331 [2024-11-27 12:04:08.160608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.331 #28 NEW cov: 12523 ft: 15527 corp: 13/753b lim: 120 exec/s: 0 rss: 72Mb L: 42/119 MS: 1 EraseBytes- 00:08:39.331 [2024-11-27 12:04:08.211303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.331 [2024-11-27 12:04:08.211339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.332 [2024-11-27 12:04:08.211455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.332 [2024-11-27 12:04:08.211479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.332 [2024-11-27 12:04:08.211603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.332 [2024-11-27 12:04:08.211629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.591 #29 NEW cov: 12523 ft: 15834 corp: 14/836b lim: 120 exec/s: 29 rss: 
73Mb L: 83/119 MS: 1 CopyPart- 00:08:39.591 [2024-11-27 12:04:08.281209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.281244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.281379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.281404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.591 #30 NEW cov: 12523 ft: 15842 corp: 15/885b lim: 120 exec/s: 30 rss: 73Mb L: 49/119 MS: 1 InsertRepeatedBytes- 00:08:39.591 [2024-11-27 12:04:08.352065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.352099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.352173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.352194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.352305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.352330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.352452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.352478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.591 #31 NEW cov: 12523 ft: 15858 corp: 16/1004b lim: 120 exec/s: 31 rss: 73Mb L: 119/119 MS: 1 ChangeByte- 00:08:39.591 [2024-11-27 12:04:08.421667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.421702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.421827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.421851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.591 #32 NEW cov: 12523 ft: 15870 corp: 17/1062b lim: 120 exec/s: 32 rss: 73Mb L: 58/119 MS: 1 ChangeBit- 00:08:39.591 [2024-11-27 12:04:08.471794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.471831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.591 [2024-11-27 12:04:08.471957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.591 [2024-11-27 12:04:08.471982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.850 #33 NEW cov: 12523 ft: 15882 corp: 18/1120b lim: 120 exec/s: 33 rss: 73Mb L: 58/119 MS: 1 ShuffleBytes- 00:08:39.850 [2024-11-27 12:04:08.522575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.522619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.522722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.522749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.522877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.522903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.523026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.523046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.850 #34 NEW cov: 12523 ft: 15910 corp: 19/1217b lim: 120 exec/s: 34 rss: 73Mb L: 97/119 MS: 1 CopyPart- 00:08:39.850 [2024-11-27 12:04:08.592173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.592210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.592329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.592358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.850 #36 NEW cov: 12523 ft: 15919 corp: 20/1282b lim: 120 exec/s: 36 rss: 73Mb L: 65/119 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:39.850 [2024-11-27 12:04:08.642514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.642552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.642681] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.642707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.642841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.642866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.850 #37 NEW cov: 12523 ft: 15935 corp: 21/1376b lim: 120 exec/s: 37 rss: 73Mb L: 94/119 MS: 1 CopyPart- 00:08:39.850 [2024-11-27 12:04:08.712842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.712874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.713001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.713029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.850 [2024-11-27 12:04:08.713153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.850 [2024-11-27 12:04:08.713176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.110 #38 NEW cov: 12523 ft: 15957 corp: 22/1461b lim: 120 exec/s: 38 rss: 73Mb L: 85/119 MS: 1 CopyPart- 00:08:40.110 [2024-11-27 12:04:08.783260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.783296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.783400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.783427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.783536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.783564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.783692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.783717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.110 #39 NEW cov: 12523 ft: 15973 corp: 
23/1573b lim: 120 exec/s: 39 rss: 73Mb L: 112/119 MS: 1 CopyPart- 00:08:40.110 [2024-11-27 12:04:08.852809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.852845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.852977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.853000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.110 #40 NEW cov: 12523 ft: 15978 corp: 24/1631b lim: 120 exec/s: 40 rss: 73Mb L: 58/119 MS: 1 ChangeByte- 00:08:40.110 [2024-11-27 12:04:08.923629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.923665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.923737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.923758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.923877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.923899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.924018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.924038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.110 #41 NEW cov: 12523 ft: 16014 corp: 25/1728b lim: 120 exec/s: 41 rss: 73Mb L: 97/119 MS: 1 ChangeBinInt- 00:08:40.110 [2024-11-27 12:04:08.973580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.973620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.973731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.973750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.110 [2024-11-27 12:04:08.973874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.110 [2024-11-27 12:04:08.973897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.369 #42 NEW cov: 12523 ft: 16039 corp: 26/1811b lim: 120 exec/s: 42 rss: 73Mb L: 83/119 MS: 1 CrossOver- 00:08:40.369 [2024-11-27 12:04:09.023997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.369 [2024-11-27 12:04:09.024031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.369 [2024-11-27 12:04:09.024119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.369 [2024-11-27 12:04:09.024144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.369 [2024-11-27 12:04:09.024267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.024292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.024415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.024439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.370 #43 NEW cov: 12523 ft: 16047 corp: 27/1930b lim: 120 exec/s: 43 rss: 73Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:08:40.370 [2024-11-27 12:04:09.073887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.073923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.074019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.074041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.074161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.074185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.370 #44 NEW cov: 12523 ft: 16061 corp: 28/2025b lim: 120 exec/s: 44 rss: 73Mb L: 95/119 MS: 1 InsertByte- 00:08:40.370 [2024-11-27 12:04:09.144412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069600709631 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.144447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.144549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.144573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.144691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:65280 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.144714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.370 [2024-11-27 12:04:09.144827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.144848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.370 #45 NEW cov: 12523 ft: 16070 corp: 29/2122b lim: 120 exec/s: 45 rss: 73Mb L: 97/119 MS: 1 ShuffleBytes- 00:08:40.370 [2024-11-27 12:04:09.193713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:799402123888954367 len:30720 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.370 [2024-11-27 12:04:09.193742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.370 #46 NEW cov: 12523 ft: 16104 corp: 30/2152b lim: 120 exec/s: 23 rss: 74Mb L: 30/119 MS: 1 ChangeByte- 00:08:40.370 #46 DONE cov: 12523 ft: 16104 corp: 30/2152b lim: 120 exec/s: 23 rss: 74Mb 00:08:40.370 Done 46 runs in 2 second(s) 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.628 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:40.629 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.629 12:04:09 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.629 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:40.629 12:04:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:40.629 [2024-11-27 12:04:09.401225] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:40.629 [2024-11-27 12:04:09.401292] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1730173 ] 00:08:40.887 [2024-11-27 12:04:09.578750] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.887 [2024-11-27 12:04:09.600827] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.887 [2024-11-27 12:04:09.653535] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.887 [2024-11-27 12:04:09.669890] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:40.887 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.887 INFO: Seed: 486718648 00:08:40.887 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:40.887 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:40.887 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:40.887 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.887 #2 INITED exec/s: 0 rss: 65Mb 00:08:40.887 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:40.887 This may also happen if the target rejected all inputs we tried so far 00:08:40.887 [2024-11-27 12:04:09.725170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:40.887 [2024-11-27 12:04:09.725199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.887 [2024-11-27 12:04:09.725267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:40.887 [2024-11-27 12:04:09.725283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.146 NEW_FUNC[1/714]: 0x476df8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:41.146 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:41.146 #5 NEW cov: 12239 ft: 12238 corp: 2/58b lim: 100 exec/s: 0 rss: 72Mb L: 57/57 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:41.404 [2024-11-27 12:04:10.046014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.404 [2024-11-27 12:04:10.046053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.404 [2024-11-27 12:04:10.046121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.404 [2024-11-27 12:04:10.046137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.404 #6 NEW cov: 12352 ft: 12861 corp: 3/115b lim: 100 exec/s: 0 rss: 72Mb L: 57/57 MS: 1 ChangeBit- 00:08:41.404 [2024-11-27 12:04:10.106006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.404 [2024-11-27 12:04:10.106036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.404 #7 NEW cov: 12358 ft: 13344 corp: 4/144b lim: 100 exec/s: 0 rss: 72Mb L: 29/57 MS: 1 InsertRepeatedBytes- 00:08:41.404 [2024-11-27 12:04:10.146213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.404 [2024-11-27 12:04:10.146239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.404 [2024-11-27 12:04:10.146274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.404 [2024-11-27 12:04:10.146288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.404 #8 NEW cov: 12443 ft: 13582 corp: 5/201b lim: 100 exec/s: 0 rss: 72Mb L: 57/57 MS: 1 ShuffleBytes- 00:08:41.405 [2024-11-27 12:04:10.186325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.405 [2024-11-27 12:04:10.186352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.405 [2024-11-27 12:04:10.186400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.405 
[2024-11-27 12:04:10.186418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.405 #9 NEW cov: 12443 ft: 13777 corp: 6/259b lim: 100 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 InsertByte- 00:08:41.405 [2024-11-27 12:04:10.226422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.405 [2024-11-27 12:04:10.226449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.405 [2024-11-27 12:04:10.226494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.405 [2024-11-27 12:04:10.226509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.405 #10 NEW cov: 12443 ft: 13879 corp: 7/317b lim: 100 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 ChangeBit- 00:08:41.405 [2024-11-27 12:04:10.286457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.405 [2024-11-27 12:04:10.286485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.664 #11 NEW cov: 12443 ft: 13963 corp: 8/354b lim: 100 exec/s: 0 rss: 72Mb L: 37/58 MS: 1 CMP- DE: "\351\007_N\313\\\222\000"- 00:08:41.664 [2024-11-27 12:04:10.346983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.664 [2024-11-27 12:04:10.347010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.664 [2024-11-27 12:04:10.347059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.664 [2024-11-27 12:04:10.347072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.664 [2024-11-27 12:04:10.347121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.664 [2024-11-27 12:04:10.347135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.664 [2024-11-27 12:04:10.347186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.664 [2024-11-27 12:04:10.347200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.664 #12 NEW cov: 12443 ft: 14329 corp: 9/445b lim: 100 exec/s: 0 rss: 72Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:41.664 [2024-11-27 12:04:10.386736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.664 [2024-11-27 12:04:10.386770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.664 #13 NEW cov: 12443 ft: 14418 corp: 10/483b lim: 100 exec/s: 0 rss: 73Mb L: 38/91 MS: 1 InsertByte- 00:08:41.664 [2024-11-27 12:04:10.446947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.664 [2024-11-27 12:04:10.446974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:41.664 #14 NEW cov: 12443 ft: 14471 corp: 11/521b lim: 100 exec/s: 0 rss: 73Mb L: 38/91 MS: 1 InsertByte- 00:08:41.664 [2024-11-27 12:04:10.487361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.665 [2024-11-27 12:04:10.487387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.665 [2024-11-27 12:04:10.487437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.665 [2024-11-27 12:04:10.487451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.665 [2024-11-27 12:04:10.487500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.665 [2024-11-27 12:04:10.487518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.665 [2024-11-27 12:04:10.487567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.665 [2024-11-27 12:04:10.487581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.665 #15 NEW cov: 12443 ft: 14512 corp: 12/620b lim: 100 exec/s: 0 rss: 73Mb L: 99/99 MS: 1 PersAutoDict- DE: "\351\007_N\313\\\222\000"- 00:08:41.665 [2024-11-27 12:04:10.547313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.665 [2024-11-27 12:04:10.547340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.665 [2024-11-27 12:04:10.547378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.665 [2024-11-27 12:04:10.547392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.924 #16 NEW cov: 12443 ft: 14525 corp: 13/678b lim: 100 exec/s: 0 rss: 73Mb L: 58/99 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\004"- 00:08:41.924 [2024-11-27 12:04:10.607686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.924 [2024-11-27 12:04:10.607712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.924 [2024-11-27 12:04:10.607760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.924 [2024-11-27 12:04:10.607774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.924 [2024-11-27 12:04:10.607823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:41.924 [2024-11-27 12:04:10.607836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.924 [2024-11-27 12:04:10.607885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:41.924 [2024-11-27 12:04:10.607915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 
p:0 m:0 dnr:1 00:08:41.924 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:41.924 #17 NEW cov: 12466 ft: 14610 corp: 14/758b lim: 100 exec/s: 0 rss: 73Mb L: 80/99 MS: 1 InsertRepeatedBytes- 00:08:41.924 [2024-11-27 12:04:10.667652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.924 [2024-11-27 12:04:10.667677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.924 [2024-11-27 12:04:10.667712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:41.924 [2024-11-27 12:04:10.667726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.924 #18 NEW cov: 12466 ft: 14667 corp: 15/816b lim: 100 exec/s: 18 rss: 73Mb L: 58/99 MS: 1 ShuffleBytes- 00:08:41.924 [2024-11-27 12:04:10.727661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.924 [2024-11-27 12:04:10.727687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.924 #19 NEW cov: 12466 ft: 14721 corp: 16/854b lim: 100 exec/s: 19 rss: 73Mb L: 38/99 MS: 1 CrossOver- 00:08:41.924 [2024-11-27 12:04:10.767766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:41.924 [2024-11-27 12:04:10.767792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 #20 NEW cov: 12466 ft: 14722 corp: 17/881b lim: 100 exec/s: 20 rss: 73Mb L: 27/99 MS: 1 EraseBytes- 00:08:42.183 [2024-11-27 12:04:10.828095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.183 [2024-11-27 12:04:10.828120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.828160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.183 [2024-11-27 12:04:10.828191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.183 #21 NEW cov: 12466 ft: 14745 corp: 18/938b lim: 100 exec/s: 21 rss: 73Mb L: 57/99 MS: 1 PersAutoDict- DE: "\351\007_N\313\\\222\000"- 00:08:42.183 [2024-11-27 12:04:10.888583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.183 [2024-11-27 12:04:10.888612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.888662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.183 [2024-11-27 12:04:10.888674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.888720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.183 [2024-11-27 12:04:10.888734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.888782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.183 [2024-11-27 12:04:10.888795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.888861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:42.183 [2024-11-27 12:04:10.888876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:42.183 #22 NEW cov: 12466 ft: 14842 corp: 19/1038b lim: 100 exec/s: 22 rss: 73Mb L: 100/100 MS: 1 CrossOver- 00:08:42.183 [2024-11-27 12:04:10.928370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.183 [2024-11-27 12:04:10.928397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.928438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.183 [2024-11-27 12:04:10.928468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.183 #23 NEW cov: 12466 ft: 14874 corp: 20/1097b lim: 100 exec/s: 23 rss: 73Mb L: 59/100 MS: 1 InsertByte- 00:08:42.183 [2024-11-27 12:04:10.968702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.183 [2024-11-27 12:04:10.968729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.968777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.183 [2024-11-27 12:04:10.968791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.968839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.183 [2024-11-27 12:04:10.968853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:10.968903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.183 [2024-11-27 12:04:10.968917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.183 #24 NEW cov: 12466 ft: 14886 corp: 21/1177b lim: 100 exec/s: 24 rss: 73Mb L: 80/100 MS: 1 ChangeBinInt- 00:08:42.183 [2024-11-27 12:04:11.028642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.183 [2024-11-27 12:04:11.028667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.183 [2024-11-27 12:04:11.028701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.183 [2024-11-27 12:04:11.028715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.442 #25 NEW cov: 12466 ft: 14892 
corp: 22/1236b lim: 100 exec/s: 25 rss: 74Mb L: 59/100 MS: 1 ChangeBinInt- 00:08:42.442 [2024-11-27 12:04:11.088797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.088822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.088856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.442 [2024-11-27 12:04:11.088869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.442 #26 NEW cov: 12466 ft: 14899 corp: 23/1293b lim: 100 exec/s: 26 rss: 74Mb L: 57/100 MS: 1 CrossOver- 00:08:42.442 [2024-11-27 12:04:11.128765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.128791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 #27 NEW cov: 12466 ft: 14973 corp: 24/1322b lim: 100 exec/s: 27 rss: 74Mb L: 29/100 MS: 1 ShuffleBytes- 00:08:42.442 [2024-11-27 12:04:11.169222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.169249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.169299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.442 [2024-11-27 12:04:11.169313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.169361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.442 [2024-11-27 12:04:11.169374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.169440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.442 [2024-11-27 12:04:11.169454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.442 #28 NEW cov: 12466 ft: 14992 corp: 25/1411b lim: 100 exec/s: 28 rss: 74Mb L: 89/100 MS: 1 InsertRepeatedBytes- 00:08:42.442 [2024-11-27 12:04:11.229090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.229115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 #29 NEW cov: 12466 ft: 15010 corp: 26/1448b lim: 100 exec/s: 29 rss: 74Mb L: 37/100 MS: 1 ShuffleBytes- 00:08:42.442 [2024-11-27 12:04:11.269295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.269321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.269361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.442 [2024-11-27 12:04:11.269375] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.442 #31 NEW cov: 12466 ft: 15042 corp: 27/1506b lim: 100 exec/s: 31 rss: 74Mb L: 58/100 MS: 2 ChangeBit-CrossOver- 00:08:42.442 [2024-11-27 12:04:11.309398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.442 [2024-11-27 12:04:11.309425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.442 [2024-11-27 12:04:11.309458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.442 [2024-11-27 12:04:11.309472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.703 #32 NEW cov: 12466 ft: 15045 corp: 28/1552b lim: 100 exec/s: 32 rss: 74Mb L: 46/100 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\004"- 00:08:42.703 [2024-11-27 12:04:11.349785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.703 [2024-11-27 12:04:11.349812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.349865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.703 [2024-11-27 12:04:11.349878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.349927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:42.703 [2024-11-27 12:04:11.349940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.349989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:42.703 [2024-11-27 12:04:11.350004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.703 #33 NEW cov: 12466 ft: 15051 corp: 29/1650b lim: 100 exec/s: 33 rss: 74Mb L: 98/100 MS: 1 CrossOver- 00:08:42.703 [2024-11-27 12:04:11.389633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.703 [2024-11-27 12:04:11.389659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.389695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.703 [2024-11-27 12:04:11.389709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.449822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.703 [2024-11-27 12:04:11.449848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.449884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.703 [2024-11-27 12:04:11.449899] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.703 #35 NEW cov: 12466 ft: 15073 corp: 30/1707b lim: 100 exec/s: 35 rss: 74Mb L: 57/100 MS: 2 PersAutoDict-ShuffleBytes- DE: "\000\000\000\000\000\000\000\004"- 00:08:42.703 [2024-11-27 12:04:11.489930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.703 [2024-11-27 12:04:11.489956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.703 [2024-11-27 12:04:11.489990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.703 [2024-11-27 12:04:11.490005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.703 #36 NEW cov: 12466 ft: 15139 corp: 31/1764b lim: 100 exec/s: 36 rss: 74Mb L: 57/100 MS: 1 CopyPart- 00:08:42.703 [2024-11-27 12:04:11.550004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.703 [2024-11-27 12:04:11.550031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.963 #37 NEW cov: 12466 ft: 15157 corp: 32/1791b lim: 100 exec/s: 37 rss: 74Mb L: 27/100 MS: 1 ChangeBit- 00:08:42.963 [2024-11-27 12:04:11.610255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.963 [2024-11-27 12:04:11.610282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.963 [2024-11-27 12:04:11.610332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:42.963 [2024-11-27 12:04:11.610347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.963 #38 NEW cov: 12466 ft: 15159 corp: 33/1848b lim: 100 exec/s: 38 rss: 74Mb L: 57/100 MS: 1 ChangeBit- 00:08:42.963 [2024-11-27 12:04:11.650220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.963 [2024-11-27 12:04:11.650247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.963 #39 NEW cov: 12466 ft: 15217 corp: 34/1877b lim: 100 exec/s: 39 rss: 74Mb L: 29/100 MS: 1 ChangeByte- 00:08:42.963 [2024-11-27 12:04:11.710403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:42.963 [2024-11-27 12:04:11.710429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.963 #40 NEW cov: 12466 ft: 15233 corp: 35/1904b lim: 100 exec/s: 20 rss: 74Mb L: 27/100 MS: 1 PersAutoDict- DE: "\351\007_N\313\\\222\000"- 00:08:42.963 #40 DONE cov: 12466 ft: 15233 corp: 35/1904b lim: 100 exec/s: 20 rss: 74Mb 00:08:42.963 ###### Recommended dictionary. ###### 00:08:42.963 "\351\007_N\313\\\222\000" # Uses: 3 00:08:42.963 "\000\000\000\000\000\000\000\004" # Uses: 2 00:08:42.963 ###### End of recommended dictionary. 
###### 00:08:42.963 Done 40 runs in 2 second(s) 00:08:42.963 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:43.221 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:43.221 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.221 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:43.221 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:43.221 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:43.222 12:04:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:43.222 [2024-11-27 12:04:11.896436] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:43.222 [2024-11-27 12:04:11.896507] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1730706 ] 00:08:43.222 [2024-11-27 12:04:12.073742] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.222 [2024-11-27 12:04:12.095677] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.480 [2024-11-27 12:04:12.148023] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.480 [2024-11-27 12:04:12.164398] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:43.480 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.480 INFO: Seed: 2978712169 00:08:43.480 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:43.480 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:43.480 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:43.480 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.480 #2 INITED exec/s: 0 rss: 65Mb 00:08:43.480 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.481 This may also happen if the target rejected all inputs we tried so far 00:08:43.481 [2024-11-27 12:04:12.223231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:43.481 [2024-11-27 12:04:12.223261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.481 [2024-11-27 12:04:12.223311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:43.481 [2024-11-27 12:04:12.223329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.739 NEW_FUNC[1/713]: 0x479db8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:43.739 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.739 #30 NEW cov: 12200 ft: 12218 corp: 2/24b lim: 50 exec/s: 0 rss: 72Mb L: 23/23 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:43.739 [2024-11-27 12:04:12.554411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14974415774176759759 len:53200 00:08:43.739 [2024-11-27 12:04:12.554464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.739 [2024-11-27 12:04:12.554536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:43.739 [2024-11-27 12:04:12.554564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.739 [2024-11-27 12:04:12.554638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:43.740 [2024-11-27 
12:04:12.554664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.740 [2024-11-27 12:04:12.554734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 00:08:43.740 [2024-11-27 12:04:12.554765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.740 [2024-11-27 12:04:12.554836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481871311 len:53200 00:08:43.740 [2024-11-27 12:04:12.554861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:43.740 NEW_FUNC[1/1]: 0xfac0b8 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:445 00:08:43.740 #32 NEW cov: 12330 ft: 13293 corp: 3/74b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:43.740 [2024-11-27 12:04:12.604053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6727636072548621661 len:23902 00:08:43.740 [2024-11-27 12:04:12.604081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.740 [2024-11-27 12:04:12.604113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6727636073941130589 len:23902 00:08:43.740 [2024-11-27 12:04:12.604128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.998 #33 NEW cov: 12336 ft: 13587 corp: 4/95b lim: 50 exec/s: 0 rss: 72Mb L: 21/50 MS: 1 InsertRepeatedBytes- 00:08:43.998 [2024-11-27 12:04:12.644123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446569248592494591 len:65536 00:08:43.998 [2024-11-27 12:04:12.644150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.998 [2024-11-27 12:04:12.644183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:43.998 [2024-11-27 12:04:12.644199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.998 #34 NEW cov: 12421 ft: 13814 corp: 5/119b lim: 50 exec/s: 0 rss: 72Mb L: 24/50 MS: 1 InsertByte- 00:08:43.998 [2024-11-27 12:04:12.704578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069588474623 len:24832 00:08:43.999 [2024-11-27 12:04:12.704608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.704654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:43.999 [2024-11-27 12:04:12.704668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.704714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6727636076669698047 len:23902 00:08:43.999 [2024-11-27 12:04:12.704730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.704778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6727636073941130589 len:23902 00:08:43.999 [2024-11-27 12:04:12.704793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.999 #35 NEW cov: 12421 ft: 14000 corp: 6/162b lim: 50 exec/s: 0 rss: 72Mb L: 43/50 MS: 1 CrossOver- 00:08:43.999 [2024-11-27 12:04:12.764589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:43.999 [2024-11-27 12:04:12.764621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.764688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743034327465983 len:1 00:08:43.999 [2024-11-27 12:04:12.764706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.764757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72057589742960640 len:65536 00:08:43.999 [2024-11-27 12:04:12.764772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.999 #36 NEW cov: 12421 ft: 14260 corp: 7/193b lim: 50 exec/s: 0 rss: 72Mb L: 31/50 MS: 1 CMP- DE: "\015\000\000\000\000\000\000\000"- 00:08:43.999 [2024-11-27 12:04:12.804557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:43.999 [2024-11-27 12:04:12.804584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.804626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:43.999 [2024-11-27 12:04:12.804642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.999 #37 NEW cov: 12421 ft: 14335 corp: 8/216b lim: 50 exec/s: 0 rss: 72Mb L: 23/50 MS: 1 CrossOver- 00:08:43.999 [2024-11-27 12:04:12.844821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:43.999 [2024-11-27 12:04:12.844850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.844890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:43.999 [2024-11-27 12:04:12.844906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.999 [2024-11-27 12:04:12.844955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 
lba:18446744073698738175 len:65536 00:08:43.999 [2024-11-27 12:04:12.844971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.258 #38 NEW cov: 12421 ft: 14360 corp: 9/248b lim: 50 exec/s: 0 rss: 72Mb L: 32/50 MS: 1 CopyPart- 00:08:44.258 [2024-11-27 12:04:12.904862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:44.258 [2024-11-27 12:04:12.904890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:12.904939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709511935 len:65536 00:08:44.258 [2024-11-27 12:04:12.904954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.258 #39 NEW cov: 12421 ft: 14377 corp: 10/271b lim: 50 exec/s: 0 rss: 72Mb L: 23/50 MS: 1 ChangeByte- 00:08:44.258 [2024-11-27 12:04:12.944962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:44.258 [2024-11-27 12:04:12.944990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:12.945035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18378345658663829503 len:1 00:08:44.258 [2024-11-27 12:04:12.945050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.258 #40 NEW cov: 12421 ft: 14477 corp: 11/300b lim: 50 exec/s: 0 rss: 72Mb L: 29/50 MS: 1 EraseBytes- 00:08:44.258 [2024-11-27 12:04:13.005521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069901123583 len:65536 00:08:44.258 [2024-11-27 12:04:13.005551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.005601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:44.258 [2024-11-27 12:04:13.005617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.005665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:44.258 [2024-11-27 12:04:13.005680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.005729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:44.258 [2024-11-27 12:04:13.005743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.005794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65291 00:08:44.258 [2024-11-27 
12:04:13.005807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.258 #44 NEW cov: 12421 ft: 14562 corp: 12/350b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 4 ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:44.258 [2024-11-27 12:04:13.045497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:3329 00:08:44.258 [2024-11-27 12:04:13.045524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.045572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:281470681743360 len:65536 00:08:44.258 [2024-11-27 12:04:13.045587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.045638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65371 00:08:44.258 [2024-11-27 12:04:13.045653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.045703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:44.258 [2024-11-27 12:04:13.045718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.258 #45 NEW cov: 12421 ft: 14575 corp: 13/390b lim: 50 exec/s: 0 rss: 73Mb L: 40/50 MS: 1 PersAutoDict- DE: "\015\000\000\000\000\000\000\000"- 00:08:44.258 [2024-11-27 12:04:13.105461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18385969261500694527 len:16851 00:08:44.258 [2024-11-27 12:04:13.105487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.258 [2024-11-27 12:04:13.105538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070967656703 len:65536 00:08:44.258 [2024-11-27 12:04:13.105553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.517 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:44.517 #46 NEW cov: 12444 ft: 14616 corp: 14/414b lim: 50 exec/s: 0 rss: 73Mb L: 24/50 MS: 1 CMP- DE: "(\025\241A\322\\\222\000"- 00:08:44.517 [2024-11-27 12:04:13.165595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18385969261500694527 len:16896 00:08:44.517 [2024-11-27 12:04:13.165630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.165679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10592466320840708608 len:65536 00:08:44.517 [2024-11-27 12:04:13.165695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.517 #47 NEW cov: 
12444 ft: 14657 corp: 15/438b lim: 50 exec/s: 0 rss: 73Mb L: 24/50 MS: 1 ShuffleBytes- 00:08:44.517 [2024-11-27 12:04:13.225796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18437736874454810623 len:65536 00:08:44.517 [2024-11-27 12:04:13.225823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.225867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18378345658663829503 len:1 00:08:44.517 [2024-11-27 12:04:13.225883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.517 #48 NEW cov: 12444 ft: 14696 corp: 16/467b lim: 50 exec/s: 48 rss: 73Mb L: 29/50 MS: 1 ChangeBit- 00:08:44.517 [2024-11-27 12:04:13.285900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18437736874454810623 len:65536 00:08:44.517 [2024-11-27 12:04:13.285927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.285979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18378345658663829503 len:1 00:08:44.517 [2024-11-27 12:04:13.285996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.517 #49 NEW cov: 12444 ft: 14722 corp: 17/496b lim: 50 exec/s: 49 rss: 73Mb L: 29/50 MS: 1 ShuffleBytes- 00:08:44.517 [2024-11-27 12:04:13.346223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446569248592494591 len:65536 00:08:44.517 [2024-11-27 12:04:13.346250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.346297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:44.517 [2024-11-27 12:04:13.346312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.346362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:44.517 [2024-11-27 12:04:13.346377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.517 #55 NEW cov: 12444 ft: 14756 corp: 18/530b lim: 50 exec/s: 55 rss: 73Mb L: 34/50 MS: 1 InsertRepeatedBytes- 00:08:44.517 [2024-11-27 12:04:13.386515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14302236982104015 len:53200 00:08:44.517 [2024-11-27 12:04:13.386543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.386595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:44.517 [2024-11-27 12:04:13.386614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:44.517 [2024-11-27 12:04:13.386663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:44.517 [2024-11-27 12:04:13.386694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.386748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 00:08:44.517 [2024-11-27 12:04:13.386764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.517 [2024-11-27 12:04:13.386814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481871311 len:53200 00:08:44.517 [2024-11-27 12:04:13.386830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.777 #56 NEW cov: 12444 ft: 14763 corp: 19/580b lim: 50 exec/s: 56 rss: 73Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:44.777 [2024-11-27 12:04:13.446482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:44.777 [2024-11-27 12:04:13.446509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.446554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415778290323407 len:53200 00:08:44.777 [2024-11-27 12:04:13.446569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.446621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:65536 00:08:44.777 [2024-11-27 12:04:13.446637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.777 #57 NEW cov: 12444 ft: 14768 corp: 20/612b lim: 50 exec/s: 57 rss: 73Mb L: 32/50 MS: 1 CrossOver- 00:08:44.777 [2024-11-27 12:04:13.486815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14974415774176723151 len:53200 00:08:44.777 [2024-11-27 12:04:13.486843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.486896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.486912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.486958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.486973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.487021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 
len:53200 00:08:44.777 [2024-11-27 12:04:13.487036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.487085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.487099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.777 #58 NEW cov: 12444 ft: 14831 corp: 21/662b lim: 50 exec/s: 58 rss: 73Mb L: 50/50 MS: 1 ChangeByte- 00:08:44.777 [2024-11-27 12:04:13.526941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14974415774176723151 len:53200 00:08:44.777 [2024-11-27 12:04:13.526969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.527039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.527059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.527109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.527124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.527175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5751043740627095503 len:53200 00:08:44.777 [2024-11-27 12:04:13.527191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.527240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.527255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.777 #59 NEW cov: 12444 ft: 14869 corp: 22/712b lim: 50 exec/s: 59 rss: 73Mb L: 50/50 MS: 1 ChangeBit- 00:08:44.777 [2024-11-27 12:04:13.586921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:44.777 [2024-11-27 12:04:13.586950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.586988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14251014050074183109 len:50630 00:08:44.777 [2024-11-27 12:04:13.587003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.587053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14251014298209207749 len:65536 00:08:44.777 [2024-11-27 12:04:13.587069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:44.777 #60 NEW cov: 12444 ft: 14884 corp: 23/751b lim: 50 exec/s: 60 rss: 73Mb L: 39/50 MS: 1 InsertRepeatedBytes- 00:08:44.777 [2024-11-27 12:04:13.627173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14974415774176723151 len:53200 00:08:44.777 [2024-11-27 12:04:13.627200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.627248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.627264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.627312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.627327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.627374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 00:08:44.777 [2024-11-27 12:04:13.627390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.777 [2024-11-27 12:04:13.627439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481883647 len:53200 00:08:44.777 [2024-11-27 12:04:13.627454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.777 #61 NEW cov: 12444 ft: 14924 corp: 24/801b lim: 50 exec/s: 61 rss: 73Mb L: 50/50 MS: 1 CrossOver- 00:08:45.036 [2024-11-27 12:04:13.667008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:45.036 [2024-11-27 12:04:13.667036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.667082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551450 len:65536 00:08:45.036 [2024-11-27 12:04:13.667097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.036 #67 NEW cov: 12444 ft: 14975 corp: 25/824b lim: 50 exec/s: 67 rss: 73Mb L: 23/50 MS: 1 CopyPart- 00:08:45.036 [2024-11-27 12:04:13.707187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:45.036 [2024-11-27 12:04:13.707214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.707255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18378345658663829503 len:1 00:08:45.036 [2024-11-27 12:04:13.707270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.707319] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18374687574888284415 len:65536 00:08:45.036 [2024-11-27 12:04:13.707334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.036 #68 NEW cov: 12444 ft: 14992 corp: 26/855b lim: 50 exec/s: 68 rss: 73Mb L: 31/50 MS: 1 CopyPart- 00:08:45.036 [2024-11-27 12:04:13.747409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:45.036 [2024-11-27 12:04:13.747435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.747486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709539279 len:65536 00:08:45.036 [2024-11-27 12:04:13.747501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.747549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65488 00:08:45.036 [2024-11-27 12:04:13.747563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.747614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 00:08:45.036 [2024-11-27 12:04:13.747630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.036 #69 NEW cov: 12444 ft: 15003 corp: 27/902b lim: 50 exec/s: 69 rss: 73Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:08:45.036 [2024-11-27 12:04:13.807473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18383693676158124031 len:65536 00:08:45.036 [2024-11-27 12:04:13.807501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.807544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743034327465983 len:1 00:08:45.036 [2024-11-27 12:04:13.807559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.036 [2024-11-27 12:04:13.807610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:72057589742960640 len:65536 00:08:45.037 [2024-11-27 12:04:13.807625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.037 #70 NEW cov: 12444 ft: 15020 corp: 28/933b lim: 50 exec/s: 70 rss: 73Mb L: 31/50 MS: 1 ChangeBinInt- 00:08:45.037 [2024-11-27 12:04:13.847550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14902075602244652750 len:52943 00:08:45.037 [2024-11-27 12:04:13.847577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.037 [2024-11-27 12:04:13.847629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14902075604643794638 len:52943 00:08:45.037 [2024-11-27 12:04:13.847645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.037 [2024-11-27 12:04:13.847695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14902075604643794638 len:52943 00:08:45.037 [2024-11-27 12:04:13.847710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.037 #72 NEW cov: 12444 ft: 15023 corp: 29/963b lim: 50 exec/s: 72 rss: 73Mb L: 30/50 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:45.037 [2024-11-27 12:04:13.887549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:45.037 [2024-11-27 12:04:13.887575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.037 [2024-11-27 12:04:13.887633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415778290323407 len:53200 00:08:45.037 [2024-11-27 12:04:13.887649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.037 #73 NEW cov: 12444 ft: 15054 corp: 30/985b lim: 50 exec/s: 73 rss: 74Mb L: 22/50 MS: 1 EraseBytes- 00:08:45.296 [2024-11-27 12:04:13.927554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18383693676158124031 len:65536 00:08:45.296 [2024-11-27 12:04:13.927582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.296 #74 NEW cov: 12444 ft: 15448 corp: 31/1003b lim: 50 exec/s: 74 rss: 74Mb L: 18/50 MS: 1 EraseBytes- 00:08:45.296 [2024-11-27 12:04:13.987883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18437736874454810623 len:12544 00:08:45.296 [2024-11-27 12:04:13.987910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:13.987962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18378345658663829503 len:1 00:08:45.296 [2024-11-27 12:04:13.987978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.296 #75 NEW cov: 12444 ft: 15453 corp: 32/1032b lim: 50 exec/s: 75 rss: 74Mb L: 29/50 MS: 1 ChangeByte- 00:08:45.296 [2024-11-27 12:04:14.048043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4742955141222110625 len:256 00:08:45.296 [2024-11-27 12:04:14.048070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.048105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:45.296 [2024-11-27 12:04:14.048121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.296 #76 NEW cov: 12444 
ft: 15461 corp: 33/1055b lim: 50 exec/s: 76 rss: 74Mb L: 23/50 MS: 1 PersAutoDict- DE: "(\025\241A\322\\\222\000"- 00:08:45.296 [2024-11-27 12:04:14.088154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4742955141222110625 len:256 00:08:45.296 [2024-11-27 12:04:14.088185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.088236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:45.296 [2024-11-27 12:04:14.088252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.296 #77 NEW cov: 12444 ft: 15508 corp: 34/1078b lim: 50 exec/s: 77 rss: 74Mb L: 23/50 MS: 1 ShuffleBytes- 00:08:45.296 [2024-11-27 12:04:14.148656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14302236982104015 len:53200 00:08:45.296 [2024-11-27 12:04:14.148684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.148739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14974415777481871311 len:53200 00:08:45.296 [2024-11-27 12:04:14.148754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.148803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14974415777481871311 len:53200 00:08:45.296 [2024-11-27 12:04:14.148818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.148866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 00:08:45.296 [2024-11-27 12:04:14.148881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.296 [2024-11-27 12:04:14.148932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:14974415777481871311 len:53200 00:08:45.296 [2024-11-27 12:04:14.148946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.555 #78 NEW cov: 12444 ft: 15515 corp: 35/1128b lim: 50 exec/s: 78 rss: 74Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:45.555 [2024-11-27 12:04:14.208600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070941310975 len:65536 00:08:45.555 [2024-11-27 12:04:14.208627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.555 [2024-11-27 12:04:14.208664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:45.555 [2024-11-27 12:04:14.208680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.555 [2024-11-27 12:04:14.208728] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:648518350625505279 len:65536 00:08:45.555 [2024-11-27 12:04:14.208742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.555 #79 NEW cov: 12444 ft: 15566 corp: 36/1160b lim: 50 exec/s: 39 rss: 74Mb L: 32/50 MS: 1 ChangeBinInt- 00:08:45.555 #79 DONE cov: 12444 ft: 15566 corp: 36/1160b lim: 50 exec/s: 39 rss: 74Mb 00:08:45.555 ###### Recommended dictionary. ###### 00:08:45.555 "\015\000\000\000\000\000\000\000" # Uses: 1 00:08:45.555 "(\025\241A\322\\\222\000" # Uses: 1 00:08:45.555 ###### End of recommended dictionary. ###### 00:08:45.555 Done 79 runs in 2 second(s) 00:08:45.555 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.555 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:45.555 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.555 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.556 12:04:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:45.556 [2024-11-27 12:04:14.393003] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:45.556 [2024-11-27 12:04:14.393077] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1730995 ] 00:08:45.816 [2024-11-27 12:04:14.576223] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.816 [2024-11-27 12:04:14.598408] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.816 [2024-11-27 12:04:14.650687] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.816 [2024-11-27 12:04:14.667045] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:45.816 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.816 INFO: Seed: 1187747447 00:08:45.816 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:45.816 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:45.816 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:45.816 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.816 #2 INITED exec/s: 0 rss: 65Mb 00:08:45.816 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.816 This may also happen if the target rejected all inputs we tried so far 00:08:46.075 [2024-11-27 12:04:14.712348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.075 [2024-11-27 12:04:14.712380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.075 [2024-11-27 12:04:14.712432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.075 [2024-11-27 12:04:14.712449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.335 NEW_FUNC[1/716]: 0x47b978 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:46.335 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:46.335 #12 NEW cov: 12273 ft: 12274 corp: 2/51b lim: 90 exec/s: 0 rss: 72Mb L: 50/50 MS: 5 CrossOver-EraseBytes-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:46.335 [2024-11-27 12:04:15.023161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.335 [2024-11-27 12:04:15.023196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.335 [2024-11-27 12:04:15.023252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.335 [2024-11-27 12:04:15.023268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.335 #23 NEW cov: 12388 ft: 12771 corp: 3/101b lim: 90 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:46.335 [2024-11-27 12:04:15.083231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.335 [2024-11-27 12:04:15.083258] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.335 [2024-11-27 12:04:15.083309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.335 [2024-11-27 12:04:15.083325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.335 #29 NEW cov: 12394 ft: 13072 corp: 4/151b lim: 90 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeBit- 00:08:46.335 [2024-11-27 12:04:15.143243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.335 [2024-11-27 12:04:15.143270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.335 #31 NEW cov: 12479 ft: 14056 corp: 5/185b lim: 90 exec/s: 0 rss: 72Mb L: 34/50 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:46.335 [2024-11-27 12:04:15.183491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.335 [2024-11-27 12:04:15.183519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.335 [2024-11-27 12:04:15.183569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.335 [2024-11-27 12:04:15.183584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.335 #38 NEW cov: 12479 ft: 14194 corp: 6/221b lim: 90 exec/s: 0 rss: 72Mb L: 36/50 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:46.595 [2024-11-27 12:04:15.223621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.595 [2024-11-27 12:04:15.223649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.595 [2024-11-27 12:04:15.223685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.595 [2024-11-27 12:04:15.223702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.595 #43 NEW cov: 12479 ft: 14235 corp: 7/261b lim: 90 exec/s: 0 rss: 72Mb L: 40/50 MS: 5 ChangeBit-ShuffleBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:46.595 [2024-11-27 12:04:15.263719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.595 [2024-11-27 12:04:15.263746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.595 [2024-11-27 12:04:15.263783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.595 [2024-11-27 12:04:15.263799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.595 #49 NEW cov: 12479 ft: 14311 corp: 8/313b lim: 90 exec/s: 0 rss: 72Mb L: 52/52 MS: 1 InsertRepeatedBytes- 00:08:46.595 [2024-11-27 12:04:15.323902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.595 [2024-11-27 12:04:15.323929] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.595 [2024-11-27 12:04:15.323964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.595 [2024-11-27 12:04:15.323980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.595 #50 NEW cov: 12479 ft: 14336 corp: 9/366b lim: 90 exec/s: 0 rss: 72Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:08:46.595 [2024-11-27 12:04:15.384064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.595 [2024-11-27 12:04:15.384090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.595 [2024-11-27 12:04:15.384128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.595 [2024-11-27 12:04:15.384144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.595 #51 NEW cov: 12479 ft: 14391 corp: 10/403b lim: 90 exec/s: 0 rss: 72Mb L: 37/53 MS: 1 EraseBytes- 00:08:46.595 [2024-11-27 12:04:15.444233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.595 [2024-11-27 12:04:15.444261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.595 [2024-11-27 12:04:15.444310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.595 [2024-11-27 12:04:15.444325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.855 #52 NEW cov: 12479 ft: 14479 corp: 11/442b lim: 90 exec/s: 0 rss: 73Mb L: 39/53 MS: 1 EraseBytes- 00:08:46.855 [2024-11-27 12:04:15.504400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.504427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 [2024-11-27 12:04:15.504476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.855 [2024-11-27 12:04:15.504491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.855 #53 NEW cov: 12479 ft: 14496 corp: 12/478b lim: 90 exec/s: 0 rss: 73Mb L: 36/53 MS: 1 ChangeByte- 00:08:46.855 [2024-11-27 12:04:15.564552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.564580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 [2024-11-27 12:04:15.564635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.855 [2024-11-27 12:04:15.564652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.855 #54 NEW cov: 12479 ft: 14561 corp: 13/528b lim: 90 exec/s: 0 rss: 
73Mb L: 50/53 MS: 1 ChangeBinInt- 00:08:46.855 [2024-11-27 12:04:15.604471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.604500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:46.855 #55 NEW cov: 12502 ft: 14621 corp: 14/562b lim: 90 exec/s: 0 rss: 73Mb L: 34/53 MS: 1 ChangeByte- 00:08:46.855 [2024-11-27 12:04:15.644762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.644790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 [2024-11-27 12:04:15.644832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.855 [2024-11-27 12:04:15.644846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.855 #56 NEW cov: 12502 ft: 14647 corp: 15/598b lim: 90 exec/s: 0 rss: 73Mb L: 36/53 MS: 1 ChangeBit- 00:08:46.855 [2024-11-27 12:04:15.684869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.684898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 [2024-11-27 12:04:15.684948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.855 [2024-11-27 12:04:15.684963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.855 #57 NEW cov: 12502 ft: 14660 corp: 16/649b lim: 90 exec/s: 57 rss: 73Mb L: 51/53 MS: 1 InsertByte- 00:08:46.855 [2024-11-27 12:04:15.724999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:46.855 [2024-11-27 12:04:15.725025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.855 [2024-11-27 12:04:15.725073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:46.855 [2024-11-27 12:04:15.725088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.113 #58 NEW cov: 12502 ft: 14665 corp: 17/689b lim: 90 exec/s: 58 rss: 73Mb L: 40/53 MS: 1 ChangeBit- 00:08:47.113 [2024-11-27 12:04:15.765119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.113 [2024-11-27 12:04:15.765146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.113 [2024-11-27 12:04:15.765193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.113 [2024-11-27 12:04:15.765208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.114 #59 NEW cov: 12502 ft: 14696 corp: 18/740b 
lim: 90 exec/s: 59 rss: 73Mb L: 51/53 MS: 1 ChangeBit- 00:08:47.114 [2024-11-27 12:04:15.825285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.114 [2024-11-27 12:04:15.825313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.114 [2024-11-27 12:04:15.825359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.114 [2024-11-27 12:04:15.825374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.114 #60 NEW cov: 12502 ft: 14751 corp: 19/790b lim: 90 exec/s: 60 rss: 73Mb L: 50/53 MS: 1 ShuffleBytes- 00:08:47.114 [2024-11-27 12:04:15.885456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.114 [2024-11-27 12:04:15.885482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.114 [2024-11-27 12:04:15.885521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.114 [2024-11-27 12:04:15.885536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.114 #61 NEW cov: 12502 ft: 14778 corp: 20/841b lim: 90 exec/s: 61 rss: 73Mb L: 51/53 MS: 1 ShuffleBytes- 00:08:47.114 [2024-11-27 12:04:15.925866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.114 [2024-11-27 12:04:15.925893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.114 [2024-11-27 12:04:15.925928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.114 [2024-11-27 12:04:15.925943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.114 [2024-11-27 12:04:15.925991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.114 [2024-11-27 12:04:15.926007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.114 [2024-11-27 12:04:15.926060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:47.114 [2024-11-27 12:04:15.926075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.114 #62 NEW cov: 12502 ft: 15188 corp: 21/922b lim: 90 exec/s: 62 rss: 73Mb L: 81/81 MS: 1 CrossOver- 00:08:47.114 [2024-11-27 12:04:15.985569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.114 [2024-11-27 12:04:15.985604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 #63 NEW cov: 12502 ft: 15227 corp: 22/956b lim: 90 exec/s: 63 rss: 73Mb L: 34/81 MS: 1 ShuffleBytes- 00:08:47.373 [2024-11-27 12:04:16.025808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.373 
[2024-11-27 12:04:16.025835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 [2024-11-27 12:04:16.025870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.373 [2024-11-27 12:04:16.025886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.373 #64 NEW cov: 12502 ft: 15306 corp: 23/1000b lim: 90 exec/s: 64 rss: 73Mb L: 44/81 MS: 1 CopyPart- 00:08:47.373 [2024-11-27 12:04:16.065926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.373 [2024-11-27 12:04:16.065954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 [2024-11-27 12:04:16.065990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.373 [2024-11-27 12:04:16.066006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.373 #65 NEW cov: 12502 ft: 15316 corp: 24/1052b lim: 90 exec/s: 65 rss: 73Mb L: 52/81 MS: 1 ChangeBit- 00:08:47.373 [2024-11-27 12:04:16.106064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.373 [2024-11-27 12:04:16.106093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 [2024-11-27 12:04:16.106145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.373 [2024-11-27 12:04:16.106162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.373 #66 NEW cov: 12502 ft: 15357 corp: 25/1096b lim: 90 exec/s: 66 rss: 73Mb L: 44/81 MS: 1 CrossOver- 00:08:47.373 [2024-11-27 12:04:16.166068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.373 [2024-11-27 12:04:16.166095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 #67 NEW cov: 12502 ft: 15370 corp: 26/1130b lim: 90 exec/s: 67 rss: 73Mb L: 34/81 MS: 1 ShuffleBytes- 00:08:47.373 [2024-11-27 12:04:16.226523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.373 [2024-11-27 12:04:16.226551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.373 [2024-11-27 12:04:16.226586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.373 [2024-11-27 12:04:16.226605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.373 [2024-11-27 12:04:16.226624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.373 [2024-11-27 12:04:16.226638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.373 #68 NEW cov: 12502 ft: 15644 corp: 
27/1192b lim: 90 exec/s: 68 rss: 73Mb L: 62/81 MS: 1 CopyPart- 00:08:47.633 [2024-11-27 12:04:16.266458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.633 [2024-11-27 12:04:16.266486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.633 [2024-11-27 12:04:16.266521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.633 [2024-11-27 12:04:16.266536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.633 #69 NEW cov: 12502 ft: 15669 corp: 28/1228b lim: 90 exec/s: 69 rss: 73Mb L: 36/81 MS: 1 ChangeByte- 00:08:47.633 [2024-11-27 12:04:16.306611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.633 [2024-11-27 12:04:16.306640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.633 [2024-11-27 12:04:16.306688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.633 [2024-11-27 12:04:16.306704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.633 #70 NEW cov: 12502 ft: 15675 corp: 29/1268b lim: 90 exec/s: 70 rss: 73Mb L: 40/81 MS: 1 ChangeBit- 00:08:47.633 [2024-11-27 12:04:16.366806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.633 [2024-11-27 12:04:16.366832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.633 [2024-11-27 12:04:16.366871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.633 [2024-11-27 12:04:16.366885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.634 #71 NEW cov: 12502 ft: 15679 corp: 30/1319b lim: 90 exec/s: 71 rss: 74Mb L: 51/81 MS: 1 ChangeBit- 00:08:47.634 [2024-11-27 12:04:16.406871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.634 [2024-11-27 12:04:16.406898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.634 [2024-11-27 12:04:16.406933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.634 [2024-11-27 12:04:16.406947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.634 #72 NEW cov: 12502 ft: 15702 corp: 31/1363b lim: 90 exec/s: 72 rss: 74Mb L: 44/81 MS: 1 ChangeBinInt- 00:08:47.634 [2024-11-27 12:04:16.467093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.634 [2024-11-27 12:04:16.467122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.634 [2024-11-27 12:04:16.467158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
00:08:47.634 [2024-11-27 12:04:16.467171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.634 [2024-11-27 12:04:16.467221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.634 [2024-11-27 12:04:16.467236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.634 #73 NEW cov: 12502 ft: 15766 corp: 32/1432b lim: 90 exec/s: 73 rss: 74Mb L: 69/81 MS: 1 InsertRepeatedBytes- 00:08:47.634 [2024-11-27 12:04:16.507157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.634 [2024-11-27 12:04:16.507184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.634 [2024-11-27 12:04:16.507222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.634 [2024-11-27 12:04:16.507237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.893 #74 NEW cov: 12502 ft: 15824 corp: 33/1473b lim: 90 exec/s: 74 rss: 74Mb L: 41/81 MS: 1 InsertByte- 00:08:47.893 [2024-11-27 12:04:16.547399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.893 [2024-11-27 12:04:16.547425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.893 [2024-11-27 12:04:16.547471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.893 [2024-11-27 12:04:16.547486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.893 [2024-11-27 12:04:16.547534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:47.893 [2024-11-27 12:04:16.547549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.893 #75 NEW cov: 12502 ft: 15839 corp: 34/1541b lim: 90 exec/s: 75 rss: 74Mb L: 68/81 MS: 1 InsertRepeatedBytes- 00:08:47.893 [2024-11-27 12:04:16.607448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.893 [2024-11-27 12:04:16.607474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.893 [2024-11-27 12:04:16.607514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.893 [2024-11-27 12:04:16.607529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.893 #76 NEW cov: 12502 ft: 15855 corp: 35/1585b lim: 90 exec/s: 76 rss: 74Mb L: 44/81 MS: 1 ChangeBinInt- 00:08:47.893 [2024-11-27 12:04:16.647527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.893 [2024-11-27 12:04:16.647553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.893 
[2024-11-27 12:04:16.647591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.894 [2024-11-27 12:04:16.647611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.894 #77 NEW cov: 12502 ft: 15874 corp: 36/1621b lim: 90 exec/s: 77 rss: 74Mb L: 36/81 MS: 1 ChangeByte- 00:08:47.894 [2024-11-27 12:04:16.687679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:47.894 [2024-11-27 12:04:16.687711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.894 [2024-11-27 12:04:16.687757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:47.894 [2024-11-27 12:04:16.687772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.894 #78 NEW cov: 12502 ft: 15889 corp: 37/1661b lim: 90 exec/s: 39 rss: 74Mb L: 40/81 MS: 1 CopyPart- 00:08:47.894 #78 DONE cov: 12502 ft: 15889 corp: 37/1661b lim: 90 exec/s: 39 rss: 74Mb 00:08:47.894 Done 78 runs in 2 second(s) 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:48.153 12:04:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:48.153 [2024-11-27 12:04:16.872159] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:48.153 [2024-11-27 12:04:16.872230] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1731527 ] 00:08:48.412 [2024-11-27 12:04:17.050995] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.412 [2024-11-27 12:04:17.072925] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.412 [2024-11-27 12:04:17.125294] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.412 [2024-11-27 12:04:17.141641] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:48.412 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.412 INFO: Seed: 3661761117 00:08:48.412 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:48.412 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:48.412 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:48.412 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.412 #2 INITED exec/s: 0 rss: 65Mb 00:08:48.412 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.412 This may also happen if the target rejected all inputs we tried so far 00:08:48.412 [2024-11-27 12:04:17.217635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.412 [2024-11-27 12:04:17.217671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.671 NEW_FUNC[1/716]: 0x47eba8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:48.671 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.671 #6 NEW cov: 12232 ft: 12232 corp: 2/13b lim: 50 exec/s: 0 rss: 72Mb L: 12/12 MS: 4 InsertByte-CrossOver-InsertByte-CMP- DE: "n\000\000\000\000\000\000\000"- 00:08:48.930 [2024-11-27 12:04:17.558430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.930 [2024-11-27 12:04:17.558475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.930 #7 NEW cov: 12362 ft: 12870 corp: 3/25b lim: 50 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeBinInt- 00:08:48.930 [2024-11-27 12:04:17.628567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.930 [2024-11-27 12:04:17.628606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.930 #8 NEW cov: 12368 ft: 13058 corp: 4/37b lim: 50 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:08:48.930 [2024-11-27 12:04:17.678752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.930 [2024-11-27 12:04:17.678791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.930 #11 NEW cov: 12453 ft: 13335 corp: 5/47b lim: 50 exec/s: 0 rss: 72Mb L: 10/12 MS: 3 PersAutoDict-ChangeBinInt-InsertByte- DE: "n\000\000\000\000\000\000\000"- 00:08:48.930 [2024-11-27 12:04:17.728947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.930 [2024-11-27 12:04:17.728979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.930 #17 NEW cov: 12453 ft: 13444 corp: 6/58b lim: 50 exec/s: 0 rss: 72Mb L: 11/12 MS: 1 InsertByte- 00:08:48.930 [2024-11-27 12:04:17.799452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.930 [2024-11-27 12:04:17.799486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.930 [2024-11-27 12:04:17.799612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.930 [2024-11-27 12:04:17.799635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.190 #18 NEW cov: 12453 ft: 14286 corp: 7/78b lim: 50 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:08:49.190 [2024-11-27 
12:04:17.869360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.190 [2024-11-27 12:04:17.869396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.190 #19 NEW cov: 12453 ft: 14370 corp: 8/90b lim: 50 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 CrossOver- 00:08:49.190 [2024-11-27 12:04:17.919494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.190 [2024-11-27 12:04:17.919528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.190 #20 NEW cov: 12453 ft: 14430 corp: 9/102b lim: 50 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 InsertByte- 00:08:49.190 [2024-11-27 12:04:17.989717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.190 [2024-11-27 12:04:17.989752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.190 #21 NEW cov: 12453 ft: 14496 corp: 10/113b lim: 50 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 ChangeBit- 00:08:49.190 [2024-11-27 12:04:18.039821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.190 [2024-11-27 12:04:18.039846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.190 #22 NEW cov: 12453 ft: 14526 corp: 11/126b lim: 50 exec/s: 0 rss: 72Mb L: 13/20 MS: 1 CrossOver- 00:08:49.449 [2024-11-27 12:04:18.089976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.449 [2024-11-27 12:04:18.090008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.449 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:49.449 #23 NEW cov: 12476 ft: 14638 corp: 12/136b lim: 50 exec/s: 0 rss: 73Mb L: 10/20 MS: 1 ChangeBit- 00:08:49.449 [2024-11-27 12:04:18.140146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.449 [2024-11-27 12:04:18.140172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.449 #24 NEW cov: 12476 ft: 14686 corp: 13/148b lim: 50 exec/s: 0 rss: 73Mb L: 12/20 MS: 1 ChangeByte- 00:08:49.449 [2024-11-27 12:04:18.210272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.449 [2024-11-27 12:04:18.210298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.449 #26 NEW cov: 12476 ft: 14800 corp: 14/162b lim: 50 exec/s: 26 rss: 73Mb L: 14/20 MS: 2 InsertByte-CrossOver- 00:08:49.449 [2024-11-27 12:04:18.261250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.449 [2024-11-27 12:04:18.261281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.449 [2024-11-27 12:04:18.261371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.449 [2024-11-27 12:04:18.261396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.449 [2024-11-27 12:04:18.261520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.449 [2024-11-27 12:04:18.261541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.450 [2024-11-27 12:04:18.261666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.450 [2024-11-27 12:04:18.261693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.450 #27 NEW cov: 12476 ft: 15181 corp: 15/206b lim: 50 exec/s: 27 rss: 73Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:49.450 [2024-11-27 12:04:18.330692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.450 [2024-11-27 12:04:18.330731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.709 #28 NEW cov: 12476 ft: 15212 corp: 16/218b lim: 50 exec/s: 28 rss: 73Mb L: 12/44 MS: 1 CopyPart- 00:08:49.709 [2024-11-27 12:04:18.380880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.709 [2024-11-27 12:04:18.380909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.709 #29 NEW cov: 12476 ft: 15234 corp: 17/228b lim: 50 exec/s: 29 rss: 73Mb L: 10/44 MS: 1 ChangeByte- 00:08:49.709 [2024-11-27 12:04:18.451384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.709 [2024-11-27 12:04:18.451420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.709 [2024-11-27 12:04:18.451560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.709 [2024-11-27 12:04:18.451584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.709 #34 NEW cov: 12476 ft: 15311 corp: 18/251b lim: 50 exec/s: 34 rss: 73Mb L: 23/44 MS: 5 EraseBytes-ChangeBit-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:49.709 [2024-11-27 12:04:18.521270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.709 [2024-11-27 12:04:18.521295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.709 #35 NEW cov: 12476 ft: 15318 corp: 19/262b lim: 50 exec/s: 35 rss: 73Mb L: 11/44 MS: 1 InsertByte- 00:08:49.709 [2024-11-27 12:04:18.571413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.709 [2024-11-27 12:04:18.571445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.969 #36 NEW cov: 12476 ft: 15360 corp: 20/275b lim: 50 exec/s: 36 rss: 73Mb L: 13/44 MS: 1 ChangeBit- 00:08:49.969 [2024-11-27 
12:04:18.642322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.969 [2024-11-27 12:04:18.642351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.642433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.969 [2024-11-27 12:04:18.642455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.642576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.969 [2024-11-27 12:04:18.642596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.642716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.969 [2024-11-27 12:04:18.642738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.969 #37 NEW cov: 12476 ft: 15374 corp: 21/322b lim: 50 exec/s: 37 rss: 73Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:49.969 [2024-11-27 12:04:18.712621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.969 [2024-11-27 12:04:18.712651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.712766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:49.969 [2024-11-27 12:04:18.712801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.712920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:49.969 [2024-11-27 12:04:18.712941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.969 [2024-11-27 12:04:18.713065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:49.969 [2024-11-27 12:04:18.713087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.969 #38 NEW cov: 12476 ft: 15380 corp: 22/370b lim: 50 exec/s: 38 rss: 73Mb L: 48/48 MS: 1 InsertByte- 00:08:49.969 [2024-11-27 12:04:18.782045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.969 [2024-11-27 12:04:18.782079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.969 #40 NEW cov: 12476 ft: 15386 corp: 23/380b lim: 50 exec/s: 40 rss: 73Mb L: 10/48 MS: 2 EraseBytes-InsertByte- 00:08:49.969 [2024-11-27 12:04:18.852225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:49.969 [2024-11-27 12:04:18.852262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.229 
#41 NEW cov: 12476 ft: 15401 corp: 24/392b lim: 50 exec/s: 41 rss: 73Mb L: 12/48 MS: 1 InsertByte- 00:08:50.229 [2024-11-27 12:04:18.902800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.229 [2024-11-27 12:04:18.902836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.229 [2024-11-27 12:04:18.902979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:50.229 [2024-11-27 12:04:18.903003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.229 #42 NEW cov: 12476 ft: 15417 corp: 25/412b lim: 50 exec/s: 42 rss: 74Mb L: 20/48 MS: 1 ChangeBinInt- 00:08:50.229 [2024-11-27 12:04:18.972569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.229 [2024-11-27 12:04:18.972595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.229 #43 NEW cov: 12476 ft: 15458 corp: 26/422b lim: 50 exec/s: 43 rss: 74Mb L: 10/48 MS: 1 PersAutoDict- DE: "n\000\000\000\000\000\000\000"- 00:08:50.229 [2024-11-27 12:04:19.042925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.229 [2024-11-27 12:04:19.042952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.229 #44 NEW cov: 12476 ft: 15467 corp: 27/434b lim: 50 exec/s: 44 rss: 74Mb L: 12/48 MS: 1 InsertByte- 00:08:50.229 [2024-11-27 12:04:19.113133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.229 [2024-11-27 12:04:19.113167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.489 #45 NEW cov: 12476 ft: 15481 corp: 28/453b lim: 50 exec/s: 45 rss: 74Mb L: 19/48 MS: 1 EraseBytes- 00:08:50.489 [2024-11-27 12:04:19.183254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:50.489 [2024-11-27 12:04:19.183288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.489 #48 NEW cov: 12476 ft: 15494 corp: 29/468b lim: 50 exec/s: 24 rss: 74Mb L: 15/48 MS: 3 EraseBytes-ChangeBit-CMP- DE: "1\264\210&\325\\\222\000"- 00:08:50.489 #48 DONE cov: 12476 ft: 15494 corp: 29/468b lim: 50 exec/s: 24 rss: 74Mb 00:08:50.489 ###### Recommended dictionary. ###### 00:08:50.489 "n\000\000\000\000\000\000\000" # Uses: 4 00:08:50.489 "1\264\210&\325\\\222\000" # Uses: 0 00:08:50.489 ###### End of recommended dictionary. 
###### 00:08:50.489 Done 48 runs in 2 second(s) 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.489 12:04:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:50.784 [2024-11-27 12:04:19.387983] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:50.784 [2024-11-27 12:04:19.388052] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1731992 ] 00:08:50.784 [2024-11-27 12:04:19.570940] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.784 [2024-11-27 12:04:19.592738] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.089 [2024-11-27 12:04:19.644943] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:51.089 [2024-11-27 12:04:19.661300] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:51.089 INFO: Running with entropic power schedule (0xFF, 100). 00:08:51.089 INFO: Seed: 1887794635 00:08:51.089 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:51.089 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:51.089 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:51.089 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.089 #2 INITED exec/s: 0 rss: 65Mb 00:08:51.089 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:51.089 This may also happen if the target rejected all inputs we tried so far 00:08:51.089 [2024-11-27 12:04:19.727014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.089 [2024-11-27 12:04:19.727044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.089 [2024-11-27 12:04:19.727085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.089 [2024-11-27 12:04:19.727101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.089 [2024-11-27 12:04:19.727155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.089 [2024-11-27 12:04:19.727174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.089 [2024-11-27 12:04:19.727227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.089 [2024-11-27 12:04:19.727242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 NEW_FUNC[1/715]: 0x480e78 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:51.372 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:51.372 #7 NEW cov: 12250 ft: 12255 corp: 2/76b lim: 85 exec/s: 0 rss: 72Mb L: 75/75 MS: 5 CrossOver-EraseBytes-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:51.372 [2024-11-27 12:04:20.089123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-11-27 12:04:20.089188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.089324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-11-27 12:04:20.089357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.089494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-11-27 12:04:20.089525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.089682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-11-27 12:04:20.089713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.372 NEW_FUNC[1/1]: 0x193ad68 in nvme_qpair_check_enabled /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:636 00:08:51.372 #13 NEW cov: 12388 ft: 13027 corp: 3/151b lim: 85 exec/s: 0 rss: 72Mb L: 75/75 MS: 1 ShuffleBytes- 00:08:51.372 [2024-11-27 12:04:20.168362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-11-27 12:04:20.168392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 #18 NEW cov: 12394 ft: 14073 corp: 4/183b lim: 85 exec/s: 0 rss: 72Mb L: 32/75 MS: 5 CopyPart-ChangeByte-InsertByte-EraseBytes-CrossOver- 00:08:51.372 [2024-11-27 12:04:20.229341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.372 [2024-11-27 12:04:20.229378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.229491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.372 [2024-11-27 12:04:20.229512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.229628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.372 [2024-11-27 12:04:20.229652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.372 [2024-11-27 12:04:20.229762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.372 [2024-11-27 12:04:20.229784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.645 #19 NEW cov: 12479 ft: 14281 corp: 5/258b lim: 85 exec/s: 0 rss: 72Mb L: 75/75 MS: 1 ShuffleBytes- 00:08:51.645 [2024-11-27 12:04:20.299484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.645 [2024-11-27 12:04:20.299522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.299617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.645 [2024-11-27 12:04:20.299638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.299765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.645 [2024-11-27 12:04:20.299788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.299901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.645 [2024-11-27 12:04:20.299920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.645 #20 NEW cov: 12479 ft: 14366 corp: 6/333b lim: 85 exec/s: 0 rss: 72Mb L: 75/75 MS: 1 ShuffleBytes- 00:08:51.645 [2024-11-27 12:04:20.349088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.645 [2024-11-27 12:04:20.349123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.349234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.645 [2024-11-27 12:04:20.349252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.645 #21 NEW cov: 12479 ft: 14766 corp: 7/377b lim: 85 exec/s: 0 rss: 72Mb L: 44/75 MS: 1 EraseBytes- 00:08:51.645 [2024-11-27 12:04:20.399816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.645 [2024-11-27 12:04:20.399844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.399955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.645 [2024-11-27 12:04:20.399980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.400092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.645 [2024-11-27 12:04:20.400115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.400225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.645 [2024-11-27 12:04:20.400245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.645 #22 NEW cov: 12479 ft: 14830 corp: 8/453b lim: 85 exec/s: 0 rss: 72Mb L: 76/76 MS: 1 InsertByte- 00:08:51.645 [2024-11-27 12:04:20.470034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.645 [2024-11-27 12:04:20.470074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.470179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.645 [2024-11-27 12:04:20.470202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.470320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:51.645 [2024-11-27 12:04:20.470343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.645 [2024-11-27 12:04:20.470464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:51.645 [2024-11-27 12:04:20.470482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.645 #23 NEW cov: 12479 ft: 14866 corp: 9/528b lim: 85 exec/s: 0 rss: 72Mb L: 75/76 MS: 1 ShuffleBytes- 00:08:51.934 [2024-11-27 12:04:20.539772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.934 [2024-11-27 12:04:20.539806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.934 [2024-11-27 12:04:20.539944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.934 [2024-11-27 12:04:20.539964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.934 #24 NEW cov: 12479 ft: 14909 corp: 10/564b lim: 85 exec/s: 0 rss: 72Mb L: 36/76 MS: 1 InsertRepeatedBytes- 00:08:51.934 [2024-11-27 12:04:20.609979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.934 [2024-11-27 12:04:20.610014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.934 [2024-11-27 12:04:20.610134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.934 [2024-11-27 12:04:20.610156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.934 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:51.934 #25 NEW cov: 12502 ft: 15002 corp: 11/601b lim: 85 exec/s: 0 rss: 73Mb L: 37/76 MS: 1 CrossOver- 00:08:51.934 [2024-11-27 12:04:20.680187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.934 [2024-11-27 12:04:20.680220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.934 [2024-11-27 12:04:20.680377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.934 [2024-11-27 12:04:20.680401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.934 #26 NEW cov: 12502 ft: 15059 corp: 12/637b lim: 85 exec/s: 0 rss: 73Mb L: 36/76 MS: 1 CrossOver- 00:08:51.934 [2024-11-27 12:04:20.730309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.934 [2024-11-27 12:04:20.730344] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.934 [2024-11-27 12:04:20.730460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:51.934 [2024-11-27 12:04:20.730480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.934 #27 NEW cov: 12502 ft: 15106 corp: 13/674b lim: 85 exec/s: 27 rss: 73Mb L: 37/76 MS: 1 CopyPart- 00:08:51.934 [2024-11-27 12:04:20.800295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:51.934 [2024-11-27 12:04:20.800328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.217 #28 NEW cov: 12502 ft: 15148 corp: 14/707b lim: 85 exec/s: 28 rss: 73Mb L: 33/76 MS: 1 EraseBytes- 00:08:52.217 [2024-11-27 12:04:20.850692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.217 [2024-11-27 12:04:20.850727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.217 [2024-11-27 12:04:20.850847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.217 [2024-11-27 12:04:20.850873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.217 #29 NEW cov: 12502 ft: 15257 corp: 15/743b lim: 85 exec/s: 29 rss: 73Mb L: 36/76 MS: 1 ChangeBinInt- 00:08:52.217 [2024-11-27 12:04:20.901360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.217 [2024-11-27 12:04:20.901396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.217 [2024-11-27 12:04:20.901512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.217 [2024-11-27 12:04:20.901546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.217 [2024-11-27 12:04:20.901675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.218 [2024-11-27 12:04:20.901695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.218 [2024-11-27 12:04:20.901816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.218 [2024-11-27 12:04:20.901837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.218 #30 NEW cov: 12502 ft: 15277 corp: 16/820b lim: 85 exec/s: 30 rss: 73Mb L: 77/77 MS: 1 InsertByte- 00:08:52.218 [2024-11-27 12:04:20.971384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.218 [2024-11-27 12:04:20.971418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.218 [2024-11-27 12:04:20.971534] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.218 [2024-11-27 12:04:20.971557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.218 [2024-11-27 12:04:20.971683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.218 [2024-11-27 12:04:20.971705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.218 #31 NEW cov: 12502 ft: 15552 corp: 17/873b lim: 85 exec/s: 31 rss: 73Mb L: 53/77 MS: 1 CopyPart- 00:08:52.218 [2024-11-27 12:04:21.021269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.218 [2024-11-27 12:04:21.021306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.218 [2024-11-27 12:04:21.021429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.218 [2024-11-27 12:04:21.021450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.218 #35 NEW cov: 12502 ft: 15561 corp: 18/919b lim: 85 exec/s: 35 rss: 73Mb L: 46/77 MS: 4 CopyPart-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:52.218 [2024-11-27 12:04:21.071158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.218 [2024-11-27 12:04:21.071191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.477 #36 NEW cov: 12502 ft: 15591 corp: 19/950b lim: 85 exec/s: 36 rss: 73Mb L: 31/77 MS: 1 EraseBytes- 00:08:52.477 [2024-11-27 12:04:21.142100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.477 [2024-11-27 12:04:21.142135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.142219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.477 [2024-11-27 12:04:21.142240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.142351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.477 [2024-11-27 12:04:21.142374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.142492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.477 [2024-11-27 12:04:21.142516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.477 #37 NEW cov: 12502 ft: 15681 corp: 20/1025b lim: 85 exec/s: 37 rss: 73Mb L: 75/77 MS: 1 CopyPart- 00:08:52.477 [2024-11-27 12:04:21.192276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.477 [2024-11-27 12:04:21.192305] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.192422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.477 [2024-11-27 12:04:21.192443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.192563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.477 [2024-11-27 12:04:21.192582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.192702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.477 [2024-11-27 12:04:21.192725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.477 #38 NEW cov: 12502 ft: 15698 corp: 21/1100b lim: 85 exec/s: 38 rss: 73Mb L: 75/77 MS: 1 CopyPart- 00:08:52.477 [2024-11-27 12:04:21.262397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.477 [2024-11-27 12:04:21.262430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.262532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.477 [2024-11-27 12:04:21.262555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.262670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.477 [2024-11-27 12:04:21.262697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.477 [2024-11-27 12:04:21.262812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.478 [2024-11-27 12:04:21.262835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.478 #39 NEW cov: 12502 ft: 15710 corp: 22/1179b lim: 85 exec/s: 39 rss: 73Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:52.478 [2024-11-27 12:04:21.331914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.478 [2024-11-27 12:04:21.331945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.737 #40 NEW cov: 12502 ft: 15756 corp: 23/1211b lim: 85 exec/s: 40 rss: 73Mb L: 32/79 MS: 1 InsertByte- 00:08:52.737 [2024-11-27 12:04:21.402137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.737 [2024-11-27 12:04:21.402168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.737 #41 NEW cov: 12502 ft: 15781 corp: 24/1243b lim: 85 exec/s: 41 rss: 73Mb L: 32/79 MS: 1 CopyPart- 00:08:52.737 [2024-11-27 12:04:21.452854] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.737 [2024-11-27 12:04:21.452889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.453028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.737 [2024-11-27 12:04:21.453046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.453172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.737 [2024-11-27 12:04:21.453196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.737 #42 NEW cov: 12502 ft: 15798 corp: 25/1295b lim: 85 exec/s: 42 rss: 73Mb L: 52/79 MS: 1 CopyPart- 00:08:52.737 [2024-11-27 12:04:21.522756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.737 [2024-11-27 12:04:21.522789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.522923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.737 [2024-11-27 12:04:21.522947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.737 #43 NEW cov: 12502 ft: 15803 corp: 26/1339b lim: 85 exec/s: 43 rss: 73Mb L: 44/79 MS: 1 ChangeBinInt- 00:08:52.737 [2024-11-27 12:04:21.593495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.737 [2024-11-27 12:04:21.593530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.593666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.737 [2024-11-27 12:04:21.593691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.593806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.737 [2024-11-27 12:04:21.593829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.737 [2024-11-27 12:04:21.593949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.737 [2024-11-27 12:04:21.593973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.996 #44 NEW cov: 12502 ft: 15844 corp: 27/1416b lim: 85 exec/s: 44 rss: 74Mb L: 77/79 MS: 1 ChangeByte- 00:08:52.996 [2024-11-27 12:04:21.663639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.996 [2024-11-27 12:04:21.663673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.996 [2024-11-27 12:04:21.663802] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.996 [2024-11-27 12:04:21.663827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.996 [2024-11-27 12:04:21.663946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:52.996 [2024-11-27 12:04:21.663968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.996 [2024-11-27 12:04:21.664092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:52.996 [2024-11-27 12:04:21.664117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.996 #45 NEW cov: 12502 ft: 15860 corp: 28/1492b lim: 85 exec/s: 45 rss: 74Mb L: 76/79 MS: 1 ChangeByte- 00:08:52.996 [2024-11-27 12:04:21.713350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:52.996 [2024-11-27 12:04:21.713383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.996 [2024-11-27 12:04:21.713531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:52.996 [2024-11-27 12:04:21.713557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.996 #46 NEW cov: 12502 ft: 15863 corp: 29/1538b lim: 85 exec/s: 23 rss: 74Mb L: 46/79 MS: 1 ChangeByte- 00:08:52.996 #46 DONE cov: 12502 ft: 15863 corp: 29/1538b lim: 85 exec/s: 23 rss: 74Mb 00:08:52.996 Done 46 runs in 2 second(s) 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:52.996 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:53.256 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:53.256 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:53.256 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:53.256 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:53.256 12:04:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:53.256 [2024-11-27 12:04:21.918565] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:53.256 [2024-11-27 12:04:21.918650] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1732363 ] 00:08:53.256 [2024-11-27 12:04:22.108901] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.256 [2024-11-27 12:04:22.131087] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.516 [2024-11-27 12:04:22.183469] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:53.516 [2024-11-27 12:04:22.199817] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:53.516 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.516 INFO: Seed: 129837517 00:08:53.516 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:53.516 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:53.516 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:53.516 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.516 #2 INITED exec/s: 0 rss: 65Mb 00:08:53.516 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:53.516 This may also happen if the target rejected all inputs we tried so far 00:08:53.516 [2024-11-27 12:04:22.245346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.516 [2024-11-27 12:04:22.245376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.516 [2024-11-27 12:04:22.245415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.516 [2024-11-27 12:04:22.245431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.516 [2024-11-27 12:04:22.245486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.516 [2024-11-27 12:04:22.245501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.775 NEW_FUNC[1/715]: 0x4840b8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:53.775 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.775 #3 NEW cov: 12209 ft: 12208 corp: 2/16b lim: 25 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:08:53.775 [2024-11-27 12:04:22.566160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.775 [2024-11-27 12:04:22.566195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.775 [2024-11-27 12:04:22.566241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.775 [2024-11-27 12:04:22.566257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.775 [2024-11-27 12:04:22.566313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.775 [2024-11-27 12:04:22.566329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.775 #9 NEW cov: 12322 ft: 12745 corp: 3/31b lim: 25 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 ShuffleBytes- 00:08:53.775 [2024-11-27 12:04:22.626199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.775 [2024-11-27 12:04:22.626229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.775 [2024-11-27 12:04:22.626267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.775 [2024-11-27 12:04:22.626283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.775 [2024-11-27 12:04:22.626341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.775 [2024-11-27 12:04:22.626357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.775 #10 NEW cov: 
12328 ft: 13187 corp: 4/47b lim: 25 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 InsertByte- 00:08:54.033 [2024-11-27 12:04:22.666306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.034 [2024-11-27 12:04:22.666338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.666377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.034 [2024-11-27 12:04:22.666393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.666450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.034 [2024-11-27 12:04:22.666466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.034 #11 NEW cov: 12413 ft: 13405 corp: 5/63b lim: 25 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 ChangeByte- 00:08:54.034 [2024-11-27 12:04:22.726589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.034 [2024-11-27 12:04:22.726624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.726675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.034 [2024-11-27 12:04:22.726691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.726748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.034 [2024-11-27 12:04:22.726764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.726822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.034 [2024-11-27 12:04:22.726837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.034 #12 NEW cov: 12413 ft: 13904 corp: 6/85b lim: 25 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:54.034 [2024-11-27 12:04:22.786817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.034 [2024-11-27 12:04:22.786845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.786894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.034 [2024-11-27 12:04:22.786911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.786967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.034 [2024-11-27 12:04:22.786984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.787042] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.034 [2024-11-27 12:04:22.787058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.034 #13 NEW cov: 12413 ft: 13942 corp: 7/108b lim: 25 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertByte- 00:08:54.034 [2024-11-27 12:04:22.846987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.034 [2024-11-27 12:04:22.847017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.847071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.034 [2024-11-27 12:04:22.847088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.847149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.034 [2024-11-27 12:04:22.847166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.847223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.034 [2024-11-27 12:04:22.847239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.034 #14 NEW cov: 12413 ft: 14013 corp: 8/129b lim: 25 exec/s: 0 rss: 72Mb L: 21/23 MS: 1 EraseBytes- 00:08:54.034 [2024-11-27 12:04:22.886973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.034 [2024-11-27 12:04:22.887001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.887045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.034 [2024-11-27 12:04:22.887063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.034 [2024-11-27 12:04:22.887118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.034 [2024-11-27 12:04:22.887134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.034 #15 NEW cov: 12413 ft: 14066 corp: 9/145b lim: 25 exec/s: 0 rss: 72Mb L: 16/23 MS: 1 CopyPart- 00:08:54.293 [2024-11-27 12:04:22.927181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:22.927209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:22.927259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:22.927275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:22.927331] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.293 [2024-11-27 12:04:22.927347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:22.927403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.293 [2024-11-27 12:04:22.927419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.293 #16 NEW cov: 12413 ft: 14100 corp: 10/169b lim: 25 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 CopyPart- 00:08:54.293 [2024-11-27 12:04:22.987240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:22.987269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:22.987318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:22.987334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:22.987392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.293 [2024-11-27 12:04:22.987408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.293 #17 NEW cov: 12413 ft: 14134 corp: 11/186b lim: 25 exec/s: 0 rss: 73Mb L: 17/24 MS: 1 InsertByte- 00:08:54.293 [2024-11-27 12:04:23.027456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:23.027483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.027526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:23.027542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.027601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.293 [2024-11-27 12:04:23.027618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.027674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.293 [2024-11-27 12:04:23.027689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.293 #18 NEW cov: 12413 ft: 14186 corp: 12/209b lim: 25 exec/s: 0 rss: 73Mb L: 23/24 MS: 1 CrossOver- 00:08:54.293 [2024-11-27 12:04:23.067569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:23.067603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.067653] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:23.067669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.067727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.293 [2024-11-27 12:04:23.067744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.067801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.293 [2024-11-27 12:04:23.067817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.293 #19 NEW cov: 12413 ft: 14225 corp: 13/233b lim: 25 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 InsertByte- 00:08:54.293 [2024-11-27 12:04:23.107562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:23.107590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.107643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:23.107659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.107717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.293 [2024-11-27 12:04:23.107734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.293 #20 NEW cov: 12413 ft: 14234 corp: 14/248b lim: 25 exec/s: 0 rss: 73Mb L: 15/24 MS: 1 InsertRepeatedBytes- 00:08:54.293 [2024-11-27 12:04:23.147559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.293 [2024-11-27 12:04:23.147587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.293 [2024-11-27 12:04:23.147635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.293 [2024-11-27 12:04:23.147651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:54.553 #21 NEW cov: 12436 ft: 14590 corp: 15/260b lim: 25 exec/s: 0 rss: 73Mb L: 12/24 MS: 1 EraseBytes- 00:08:54.553 [2024-11-27 12:04:23.207834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.553 [2024-11-27 12:04:23.207862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.207909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.553 [2024-11-27 12:04:23.207925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.207981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.553 [2024-11-27 12:04:23.207997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.553 #22 NEW cov: 12436 ft: 14618 corp: 16/277b lim: 25 exec/s: 22 rss: 73Mb L: 17/24 MS: 1 InsertByte- 00:08:54.553 [2024-11-27 12:04:23.267885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.553 [2024-11-27 12:04:23.267913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.267951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.553 [2024-11-27 12:04:23.267967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 #23 NEW cov: 12436 ft: 14722 corp: 17/289b lim: 25 exec/s: 23 rss: 73Mb L: 12/24 MS: 1 CopyPart- 00:08:54.553 [2024-11-27 12:04:23.328176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.553 [2024-11-27 12:04:23.328204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.328246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.553 [2024-11-27 12:04:23.328262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.328316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.553 [2024-11-27 12:04:23.328332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.553 #24 NEW cov: 12436 ft: 14738 corp: 18/306b lim: 25 exec/s: 24 rss: 73Mb L: 17/24 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:54.553 [2024-11-27 12:04:23.388502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.553 [2024-11-27 12:04:23.388530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.388581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.553 [2024-11-27 12:04:23.388601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.388672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.553 [2024-11-27 12:04:23.388689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.388745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.553 [2024-11-27 12:04:23.388761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.553 #25 NEW cov: 12436 ft: 14772 corp: 19/329b lim: 25 exec/s: 25 rss: 73Mb L: 23/24 MS: 1 InsertRepeatedBytes- 00:08:54.553 [2024-11-27 12:04:23.428548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.553 [2024-11-27 12:04:23.428580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.428625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.553 [2024-11-27 12:04:23.428641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.553 [2024-11-27 12:04:23.428699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.553 [2024-11-27 12:04:23.428715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.554 [2024-11-27 12:04:23.428771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.554 [2024-11-27 12:04:23.428785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.813 #26 NEW cov: 12436 ft: 14855 corp: 20/350b lim: 25 exec/s: 26 rss: 73Mb L: 21/24 MS: 1 EraseBytes- 00:08:54.813 [2024-11-27 12:04:23.468552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.813 [2024-11-27 12:04:23.468580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.813 [2024-11-27 12:04:23.468636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.813 [2024-11-27 12:04:23.468652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.813 [2024-11-27 12:04:23.468705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.813 [2024-11-27 12:04:23.468722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.813 #27 NEW cov: 12436 ft: 14884 corp: 21/367b lim: 25 exec/s: 27 rss: 73Mb L: 17/24 MS: 1 CrossOver- 00:08:54.813 [2024-11-27 12:04:23.528870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.813 [2024-11-27 12:04:23.528899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.813 [2024-11-27 12:04:23.528955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.813 [2024-11-27 12:04:23.528971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.813 [2024-11-27 12:04:23.529029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.813 [2024-11-27 12:04:23.529046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.813 [2024-11-27 12:04:23.529106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.813 [2024-11-27 12:04:23.529122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.813 #28 NEW cov: 12436 ft: 14889 corp: 22/391b lim: 25 exec/s: 28 rss: 73Mb L: 24/24 MS: 1 CopyPart- 00:08:54.813 [2024-11-27 12:04:23.589084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.813 [2024-11-27 12:04:23.589113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.589165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.814 [2024-11-27 12:04:23.589180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.589238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.814 [2024-11-27 12:04:23.589255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.589308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.814 [2024-11-27 12:04:23.589324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.814 #29 NEW cov: 12436 ft: 14895 corp: 23/413b lim: 25 exec/s: 29 rss: 73Mb L: 22/24 MS: 1 ChangeBinInt- 00:08:54.814 [2024-11-27 12:04:23.629311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.814 [2024-11-27 12:04:23.629341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.629403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.814 [2024-11-27 12:04:23.629418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.629477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.814 [2024-11-27 12:04:23.629493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.629546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.814 [2024-11-27 12:04:23.629561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.629623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:54.814 [2024-11-27 12:04:23.629639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.814 #30 NEW cov: 12436 ft: 14972 corp: 24/438b 
lim: 25 exec/s: 30 rss: 73Mb L: 25/25 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:54.814 [2024-11-27 12:04:23.669215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:54.814 [2024-11-27 12:04:23.669244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.669296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:54.814 [2024-11-27 12:04:23.669312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.669366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:54.814 [2024-11-27 12:04:23.669382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.814 [2024-11-27 12:04:23.669438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:54.814 [2024-11-27 12:04:23.669453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.814 #31 NEW cov: 12436 ft: 15031 corp: 25/461b lim: 25 exec/s: 31 rss: 73Mb L: 23/25 MS: 1 ChangeByte- 00:08:55.073 [2024-11-27 12:04:23.709391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.073 [2024-11-27 12:04:23.709420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.709471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.073 [2024-11-27 12:04:23.709487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.709548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.073 [2024-11-27 12:04:23.709564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.709618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.073 [2024-11-27 12:04:23.709635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.073 #35 NEW cov: 12436 ft: 15108 corp: 26/482b lim: 25 exec/s: 35 rss: 73Mb L: 21/25 MS: 4 InsertByte-CopyPart-ChangeBinInt-CrossOver- 00:08:55.073 [2024-11-27 12:04:23.749235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.073 [2024-11-27 12:04:23.749264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.749313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.073 [2024-11-27 12:04:23.749329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:55.073 #36 NEW cov: 12436 ft: 15161 corp: 27/494b lim: 25 exec/s: 36 rss: 73Mb L: 12/25 MS: 1 EraseBytes- 00:08:55.073 [2024-11-27 12:04:23.809700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.073 [2024-11-27 12:04:23.809729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.809786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.073 [2024-11-27 12:04:23.809803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.809857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.073 [2024-11-27 12:04:23.809873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.809929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.073 [2024-11-27 12:04:23.809946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.073 #37 NEW cov: 12436 ft: 15195 corp: 28/518b lim: 25 exec/s: 37 rss: 74Mb L: 24/25 MS: 1 ChangeByte- 00:08:55.073 [2024-11-27 12:04:23.849791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.073 [2024-11-27 12:04:23.849819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.849873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.073 [2024-11-27 12:04:23.849889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.849944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.073 [2024-11-27 12:04:23.849961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.073 [2024-11-27 12:04:23.850017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.073 [2024-11-27 12:04:23.850034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.073 #43 NEW cov: 12436 ft: 15212 corp: 29/542b lim: 25 exec/s: 43 rss: 74Mb L: 24/25 MS: 1 CopyPart- 00:08:55.073 [2024-11-27 12:04:23.909860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.073 [2024-11-27 12:04:23.909893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.074 [2024-11-27 12:04:23.909931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.074 [2024-11-27 12:04:23.909947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.074 
[2024-11-27 12:04:23.910001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.074 [2024-11-27 12:04:23.910016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.074 #44 NEW cov: 12436 ft: 15227 corp: 30/560b lim: 25 exec/s: 44 rss: 74Mb L: 18/25 MS: 1 InsertRepeatedBytes- 00:08:55.074 [2024-11-27 12:04:23.950039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.074 [2024-11-27 12:04:23.950070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.074 [2024-11-27 12:04:23.950122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.074 [2024-11-27 12:04:23.950139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.074 [2024-11-27 12:04:23.950198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.074 [2024-11-27 12:04:23.950215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.074 [2024-11-27 12:04:23.950275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.074 [2024-11-27 12:04:23.950291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.333 #45 NEW cov: 12436 ft: 15271 corp: 31/584b lim: 25 exec/s: 45 rss: 74Mb L: 24/25 MS: 1 ChangeByte- 00:08:55.333 [2024-11-27 12:04:24.009981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.333 [2024-11-27 12:04:24.010010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.010050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.333 [2024-11-27 12:04:24.010066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.333 #50 NEW cov: 12436 ft: 15291 corp: 32/596b lim: 25 exec/s: 50 rss: 74Mb L: 12/25 MS: 5 CrossOver-InsertByte-CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:55.333 [2024-11-27 12:04:24.070438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.333 [2024-11-27 12:04:24.070467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.070520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.333 [2024-11-27 12:04:24.070536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.070593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.333 [2024-11-27 12:04:24.070615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.070673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.333 [2024-11-27 12:04:24.070689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.333 #51 NEW cov: 12436 ft: 15318 corp: 33/618b lim: 25 exec/s: 51 rss: 74Mb L: 22/25 MS: 1 ShuffleBytes- 00:08:55.333 [2024-11-27 12:04:24.110522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.333 [2024-11-27 12:04:24.110550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.110608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.333 [2024-11-27 12:04:24.110622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.110681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.333 [2024-11-27 12:04:24.110697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.110755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.333 [2024-11-27 12:04:24.110772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.333 #52 NEW cov: 12436 ft: 15332 corp: 34/642b lim: 25 exec/s: 52 rss: 74Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:55.333 [2024-11-27 12:04:24.150610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.333 [2024-11-27 12:04:24.150638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.150694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.333 [2024-11-27 12:04:24.150710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.150764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:55.333 [2024-11-27 12:04:24.150780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.150839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:55.333 [2024-11-27 12:04:24.150855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.333 #53 NEW cov: 12436 ft: 15337 corp: 35/665b lim: 25 exec/s: 53 rss: 74Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:55.333 [2024-11-27 12:04:24.190490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:55.333 [2024-11-27 12:04:24.190518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:55.333 [2024-11-27 12:04:24.190558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:55.333 [2024-11-27 12:04:24.190575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.593 #54 NEW cov: 12436 ft: 15356 corp: 36/679b lim: 25 exec/s: 27 rss: 74Mb L: 14/25 MS: 1 EraseBytes- 00:08:55.593 #54 DONE cov: 12436 ft: 15356 corp: 36/679b lim: 25 exec/s: 27 rss: 74Mb 00:08:55.593 ###### Recommended dictionary. ###### 00:08:55.593 "\002\000\000\000\000\000\000\000" # Uses: 1 00:08:55.593 ###### End of recommended dictionary. ###### 00:08:55.593 Done 54 runs in 2 second(s) 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:55.593 12:04:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:55.593 [2024-11-27 12:04:24.393375] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:55.594 [2024-11-27 12:04:24.393448] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1732892 ] 00:08:55.853 [2024-11-27 12:04:24.569876] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.853 [2024-11-27 12:04:24.591632] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.853 [2024-11-27 12:04:24.643843] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:55.853 [2024-11-27 12:04:24.660206] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:55.853 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.853 INFO: Seed: 2591832613 00:08:55.853 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:55.853 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:55.853 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:55.853 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.853 #2 INITED exec/s: 0 rss: 65Mb 00:08:55.853 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:55.853 This may also happen if the target rejected all inputs we tried so far 00:08:55.853 [2024-11-27 12:04:24.705085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.853 [2024-11-27 12:04:24.705121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.853 [2024-11-27 12:04:24.705155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.853 [2024-11-27 12:04:24.705173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.853 [2024-11-27 12:04:24.705201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.853 [2024-11-27 12:04:24.705222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.853 [2024-11-27 12:04:24.705251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.853 [2024-11-27 12:04:24.705266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.371 NEW_FUNC[1/716]: 0x4851a8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:56.371 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:56.371 #4 NEW cov: 12281 ft: 12280 corp: 2/81b lim: 100 exec/s: 0 rss: 72Mb L: 80/80 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:56.371 [2024-11-27 12:04:25.055893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.055931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.371 [2024-11-27 12:04:25.055966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.055984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.371 #5 NEW cov: 12394 ft: 13250 corp: 3/131b lim: 100 exec/s: 0 rss: 72Mb L: 50/80 MS: 1 CrossOver- 00:08:56.371 [2024-11-27 12:04:25.115949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509728149143569 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.115984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.371 [2024-11-27 12:04:25.116018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.116036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.371 [2024-11-27 12:04:25.116066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.116083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.371 #10 NEW cov: 12400 ft: 13717 corp: 4/192b lim: 100 exec/s: 0 rss: 72Mb L: 61/80 MS: 5 ChangeByte-CrossOver-ShuffleBytes-ChangeBinInt-CrossOver- 00:08:56.371 [2024-11-27 12:04:25.176074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.371 [2024-11-27 12:04:25.176108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.630 #21 NEW cov: 12485 ft: 14946 corp: 5/224b lim: 100 exec/s: 0 rss: 72Mb L: 32/80 MS: 1 EraseBytes- 00:08:56.630 [2024-11-27 12:04:25.276270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.276302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.630 #22 NEW cov: 12485 ft: 15079 corp: 6/256b lim: 100 exec/s: 0 rss: 72Mb L: 32/80 MS: 1 ShuffleBytes- 00:08:56.630 [2024-11-27 12:04:25.366496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.366528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.630 #23 NEW cov: 12485 ft: 15149 corp: 7/288b lim: 100 exec/s: 0 rss: 72Mb L: 32/80 MS: 1 CopyPart- 00:08:56.630 [2024-11-27 12:04:25.426831] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.426864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.630 [2024-11-27 12:04:25.426897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.426914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.630 [2024-11-27 12:04:25.426945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.426962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.630 [2024-11-27 12:04:25.426991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.427007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:56.630 #26 NEW cov: 12485 ft: 15249 corp: 8/376b lim: 100 exec/s: 0 rss: 72Mb L: 88/88 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:56.630 [2024-11-27 12:04:25.486848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.486881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.630 [2024-11-27 12:04:25.486916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.630 [2024-11-27 12:04:25.486934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.889 #27 NEW cov: 12485 ft: 15304 corp: 9/426b lim: 100 exec/s: 0 rss: 72Mb L: 50/88 MS: 1 ChangeBit- 00:08:56.889 [2024-11-27 12:04:25.546954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.889 [2024-11-27 12:04:25.546985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.889 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:56.889 #28 NEW cov: 12508 ft: 15350 corp: 10/459b lim: 100 exec/s: 0 rss: 72Mb L: 33/88 MS: 1 InsertByte- 00:08:56.889 [2024-11-27 12:04:25.637353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.889 [2024-11-27 12:04:25.637386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.889 [2024-11-27 12:04:25.637420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.889 [2024-11-27 12:04:25.637437] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.889 [2024-11-27 12:04:25.637468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.889 [2024-11-27 12:04:25.637484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.889 #29 NEW cov: 12508 ft: 15436 corp: 11/521b lim: 100 exec/s: 29 rss: 73Mb L: 62/88 MS: 1 EraseBytes- 00:08:56.889 [2024-11-27 12:04:25.727385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.889 [2024-11-27 12:04:25.727416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.148 #30 NEW cov: 12508 ft: 15540 corp: 12/554b lim: 100 exec/s: 30 rss: 73Mb L: 33/88 MS: 1 ShuffleBytes- 00:08:57.148 [2024-11-27 12:04:25.817709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725856869548 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.817740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:25.817776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.817794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.148 #33 NEW cov: 12508 ft: 15549 corp: 13/605b lim: 100 exec/s: 33 rss: 73Mb L: 51/88 MS: 3 ChangeByte-ShuffleBytes-CrossOver- 00:08:57.148 [2024-11-27 12:04:25.868034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.868065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:25.868098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.868116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:25.868147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.868163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:25.868192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.868208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:25.868237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.868252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:57.148 #34 NEW cov: 12508 ft: 15617 corp: 14/705b lim: 100 exec/s: 34 rss: 73Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:57.148 [2024-11-27 12:04:25.927901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:25.927931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.148 #35 NEW cov: 12508 ft: 15642 corp: 15/728b lim: 100 exec/s: 35 rss: 73Mb L: 23/100 MS: 1 EraseBytes- 00:08:57.148 [2024-11-27 12:04:26.018208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:26.018240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.148 [2024-11-27 12:04:26.018273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.148 [2024-11-27 12:04:26.018290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.407 #36 NEW cov: 12508 ft: 15675 corp: 16/778b lim: 100 exec/s: 36 rss: 73Mb L: 50/100 MS: 1 CopyPart- 00:08:57.407 [2024-11-27 12:04:26.108385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.407 [2024-11-27 12:04:26.108416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.407 #37 NEW cov: 12508 ft: 15704 corp: 17/803b lim: 100 exec/s: 37 rss: 73Mb L: 25/100 MS: 1 EraseBytes- 00:08:57.407 [2024-11-27 12:04:26.158509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.407 [2024-11-27 12:04:26.158540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.407 #38 NEW cov: 12508 ft: 15744 corp: 18/842b lim: 100 exec/s: 38 rss: 73Mb L: 39/100 MS: 1 EraseBytes- 00:08:57.407 [2024-11-27 12:04:26.248935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.407 [2024-11-27 12:04:26.248966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.407 [2024-11-27 12:04:26.248997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.407 [2024-11-27 12:04:26.249015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.408 [2024-11-27 12:04:26.249044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.408 [2024-11-27 
12:04:26.249060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.408 [2024-11-27 12:04:26.249088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.408 [2024-11-27 12:04:26.249121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.408 #39 NEW cov: 12508 ft: 15750 corp: 19/930b lim: 100 exec/s: 39 rss: 73Mb L: 88/100 MS: 1 ChangeByte- 00:08:57.667 [2024-11-27 12:04:26.299128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12465681185662348460 len:65279 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.299161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.299195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18374403900871474942 len:65279 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.299213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.299245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442474543777098924 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.299262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.299291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.299308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.667 #40 NEW cov: 12508 ft: 15765 corp: 20/1014b lim: 100 exec/s: 40 rss: 73Mb L: 84/100 MS: 1 InsertRepeatedBytes- 00:08:57.667 [2024-11-27 12:04:26.389282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4160749568 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.389318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.389352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.389370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.389400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.389417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.667 #45 NEW cov: 12508 ft: 15772 corp: 21/1093b lim: 100 exec/s: 45 rss: 73Mb L: 79/100 MS: 5 CrossOver-EraseBytes-ShuffleBytes-ChangeByte-CrossOver- 00:08:57.667 [2024-11-27 12:04:26.479527] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.479558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.479591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.479615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.479662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.479680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.667 [2024-11-27 12:04:26.479709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12465963767163563180 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.667 [2024-11-27 12:04:26.479726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.926 #46 NEW cov: 12508 ft: 15777 corp: 22/1184b lim: 100 exec/s: 46 rss: 73Mb L: 91/100 MS: 1 InsertRepeatedBytes- 00:08:57.926 [2024-11-27 12:04:26.569773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.926 [2024-11-27 12:04:26.569804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.926 [2024-11-27 12:04:26.569837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.926 [2024-11-27 12:04:26.569854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.926 [2024-11-27 12:04:26.569883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.926 [2024-11-27 12:04:26.569899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.926 [2024-11-27 12:04:26.569927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.926 [2024-11-27 12:04:26.569959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.926 #47 NEW cov: 12508 ft: 15786 corp: 23/1265b lim: 100 exec/s: 47 rss: 73Mb L: 81/100 MS: 1 InsertByte- 00:08:57.926 [2024-11-27 12:04:26.629747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.926 [2024-11-27 12:04:26.629777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:57.927 #48 NEW cov: 12508 ft: 15811 corp: 24/1303b lim: 100 exec/s: 48 rss: 73Mb L: 38/100 MS: 1 EraseBytes- 00:08:57.927 [2024-11-27 12:04:26.689967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12442509725431278764 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.927 [2024-11-27 12:04:26.689999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.927 [2024-11-27 12:04:26.690033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:57.927 [2024-11-27 12:04:26.690050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.927 #49 NEW cov: 12508 ft: 15822 corp: 25/1353b lim: 100 exec/s: 24 rss: 73Mb L: 50/100 MS: 1 ChangeBinInt- 00:08:57.927 #49 DONE cov: 12508 ft: 15822 corp: 25/1353b lim: 100 exec/s: 24 rss: 73Mb 00:08:57.927 Done 49 runs in 2 second(s) 00:08:58.185 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:58.186 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:58.186 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.186 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:58.186 00:08:58.186 real 1m2.754s 00:08:58.186 user 1m39.327s 00:08:58.186 sys 0m7.113s 00:08:58.186 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:58.186 12:04:26 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:58.186 ************************************ 00:08:58.186 END TEST nvmf_llvm_fuzz 00:08:58.186 ************************************ 00:08:58.186 12:04:26 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:58.186 12:04:26 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:58.186 12:04:26 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:58.186 12:04:26 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:58.186 12:04:26 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:58.186 12:04:26 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:58.186 ************************************ 00:08:58.186 START TEST vfio_llvm_fuzz 00:08:58.186 ************************************ 00:08:58.186 12:04:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:58.186 * Looking for test storage... 
00:08:58.186 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.186 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:58.186 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:58.186 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:58.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.448 --rc genhtml_branch_coverage=1 00:08:58.448 --rc genhtml_function_coverage=1 00:08:58.448 --rc genhtml_legend=1 00:08:58.448 --rc geninfo_all_blocks=1 00:08:58.448 --rc geninfo_unexecuted_blocks=1 00:08:58.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.448 ' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:58.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.448 --rc genhtml_branch_coverage=1 00:08:58.448 --rc genhtml_function_coverage=1 00:08:58.448 --rc genhtml_legend=1 00:08:58.448 --rc geninfo_all_blocks=1 00:08:58.448 --rc geninfo_unexecuted_blocks=1 00:08:58.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.448 ' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:58.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.448 --rc genhtml_branch_coverage=1 00:08:58.448 --rc genhtml_function_coverage=1 00:08:58.448 --rc genhtml_legend=1 00:08:58.448 --rc geninfo_all_blocks=1 00:08:58.448 --rc geninfo_unexecuted_blocks=1 00:08:58.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.448 ' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:58.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.448 --rc genhtml_branch_coverage=1 00:08:58.448 --rc genhtml_function_coverage=1 00:08:58.448 --rc genhtml_legend=1 00:08:58.448 --rc geninfo_all_blocks=1 00:08:58.448 --rc geninfo_unexecuted_blocks=1 00:08:58.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.448 ' 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:58.448 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:08:58.449 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:58.449 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:58.449 #define SPDK_CONFIG_H 00:08:58.449 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:58.449 #define SPDK_CONFIG_APPS 1 00:08:58.449 #define SPDK_CONFIG_ARCH native 00:08:58.449 #undef SPDK_CONFIG_ASAN 00:08:58.449 #undef SPDK_CONFIG_AVAHI 00:08:58.449 #undef SPDK_CONFIG_CET 00:08:58.449 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:58.449 #define SPDK_CONFIG_COVERAGE 1 00:08:58.449 #define SPDK_CONFIG_CROSS_PREFIX 00:08:58.449 #undef SPDK_CONFIG_CRYPTO 00:08:58.449 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:58.449 #undef SPDK_CONFIG_CUSTOMOCF 00:08:58.450 #undef SPDK_CONFIG_DAOS 00:08:58.450 #define SPDK_CONFIG_DAOS_DIR 00:08:58.450 #define SPDK_CONFIG_DEBUG 1 00:08:58.450 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:58.450 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:58.450 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:58.450 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:58.450 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:58.450 #undef SPDK_CONFIG_DPDK_UADK 00:08:58.450 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:58.450 #define SPDK_CONFIG_EXAMPLES 1 00:08:58.450 #undef SPDK_CONFIG_FC 00:08:58.450 #define SPDK_CONFIG_FC_PATH 00:08:58.450 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:58.450 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:58.450 #define SPDK_CONFIG_FSDEV 1 00:08:58.450 #undef SPDK_CONFIG_FUSE 00:08:58.450 #define SPDK_CONFIG_FUZZER 1 00:08:58.450 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:58.450 #undef SPDK_CONFIG_GOLANG 00:08:58.450 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:58.450 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:58.450 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:58.450 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:58.450 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:58.450 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:58.450 #undef SPDK_CONFIG_HAVE_LZ4 00:08:58.450 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:58.450 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:58.450 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:58.450 #define SPDK_CONFIG_IDXD 1 00:08:58.450 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:58.450 #undef SPDK_CONFIG_IPSEC_MB 00:08:58.450 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:58.450 #define SPDK_CONFIG_ISAL 1 00:08:58.450 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:58.450 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:58.450 #define SPDK_CONFIG_LIBDIR 00:08:58.450 #undef SPDK_CONFIG_LTO 00:08:58.450 #define SPDK_CONFIG_MAX_LCORES 128 00:08:58.450 #define SPDK_CONFIG_NVME_CUSE 1 00:08:58.450 #undef SPDK_CONFIG_OCF 00:08:58.450 #define SPDK_CONFIG_OCF_PATH 00:08:58.450 #define SPDK_CONFIG_OPENSSL_PATH 00:08:58.450 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:58.450 #define SPDK_CONFIG_PGO_DIR 00:08:58.450 #undef SPDK_CONFIG_PGO_USE 00:08:58.450 #define SPDK_CONFIG_PREFIX /usr/local 00:08:58.450 #undef SPDK_CONFIG_RAID5F 00:08:58.450 #undef SPDK_CONFIG_RBD 00:08:58.450 #define SPDK_CONFIG_RDMA 1 00:08:58.450 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:58.450 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:58.450 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:58.450 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:58.450 #undef SPDK_CONFIG_SHARED 00:08:58.450 #undef SPDK_CONFIG_SMA 00:08:58.450 #define SPDK_CONFIG_TESTS 1 00:08:58.450 #undef SPDK_CONFIG_TSAN 00:08:58.450 #define SPDK_CONFIG_UBLK 1 00:08:58.450 #define SPDK_CONFIG_UBSAN 1 00:08:58.450 #undef SPDK_CONFIG_UNIT_TESTS 00:08:58.450 #undef SPDK_CONFIG_URING 00:08:58.450 #define SPDK_CONFIG_URING_PATH 00:08:58.450 #undef SPDK_CONFIG_URING_ZNS 00:08:58.450 #undef SPDK_CONFIG_USDT 00:08:58.450 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:58.450 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:58.450 #define SPDK_CONFIG_VFIO_USER 1 00:08:58.450 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:58.450 #define SPDK_CONFIG_VHOST 1 00:08:58.450 #define SPDK_CONFIG_VIRTIO 1 00:08:58.450 #undef SPDK_CONFIG_VTUNE 00:08:58.450 #define SPDK_CONFIG_VTUNE_DIR 00:08:58.450 #define SPDK_CONFIG_WERROR 1 00:08:58.450 #define SPDK_CONFIG_WPDK_DIR 00:08:58.450 #undef SPDK_CONFIG_XNVME 00:08:58.450 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:58.450 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:58.450 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:58.450 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
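The alternating "# : 0" / "# export SPDK_TEST_*" records in this stretch are autotest_common.sh giving each test flag a default before exporting it; only the fuzzer-related flags are switched on for this short-fuzz job. A minimal sketch of that idiom, with flag names and values taken from the trace (the exact upstream lines may differ):

  # Default each flag to the value shown in the trace unless the caller already
  # set it, then export it so child scripts and the fuzzer harness agree.
  : "${SPDK_TEST_FUZZER:=1}"        # fuzzer suite enabled for this job
  export SPDK_TEST_FUZZER
  : "${SPDK_TEST_FUZZER_SHORT:=1}"  # short variant of the fuzz run
  export SPDK_TEST_FUZZER_SHORT
  : "${SPDK_TEST_NVME:=0}"          # unrelated suites stay disabled
  export SPDK_TEST_NVME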
00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:58.451 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:58.451 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:58.451 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 1733353 ]] 00:08:58.452 
12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 1733353 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.LaVBaQ 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.LaVBaQ/tests/vfio /tmp/spdk.LaVBaQ 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51249405952 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10481201152 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860537856 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340129792 00:08:58.452 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30863654912 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=1650688 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:08:58.453 * Looking for test storage... 
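The records above trace set_test_storage: the `df -T` output (minus its header) is read into per-mount associative arrays, and each storage candidate directory is then checked for enough free space before SPDK_TEST_STORAGE is exported. A condensed, non-verbatim sketch of that flow in the same shell style (array and variable names follow the trace; the block-to-byte conversion and the extra 64 MiB on top of the requested 2 GiB are inferred from the numbers shown):

  declare -A mounts fss sizes avails uses
  requested_size=$((2147483648 + 64 * 1024 * 1024))   # 2214592512, as in the trace
  # Column order matches the trace: source, fstype, 1K-blocks, used, avail, _, mountpoint
  while read -r source fs size use avail _ mount; do
      mounts["$mount"]=$source; fss["$mount"]=$fs
      sizes["$mount"]=$((size * 1024)); avails["$mount"]=$((avail * 1024)); uses["$mount"]=$((use * 1024))
  done < <(df -T | grep -v Filesystem)
  # Resolve which mount a candidate test dir lives on and check its free space.
  mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
  target_space=${avails[$mount]}
  if (( target_space >= requested_size )); then
      export SPDK_TEST_STORAGE=$target_dir
  fi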
00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51249405952 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12695793664 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.453 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:58.453 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:58.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.713 --rc genhtml_branch_coverage=1 00:08:58.713 --rc genhtml_function_coverage=1 00:08:58.713 --rc genhtml_legend=1 00:08:58.713 --rc geninfo_all_blocks=1 00:08:58.713 --rc geninfo_unexecuted_blocks=1 00:08:58.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.713 ' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:58.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.713 --rc genhtml_branch_coverage=1 00:08:58.713 --rc genhtml_function_coverage=1 00:08:58.713 --rc genhtml_legend=1 00:08:58.713 --rc geninfo_all_blocks=1 00:08:58.713 --rc geninfo_unexecuted_blocks=1 00:08:58.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.713 ' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:58.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.713 --rc genhtml_branch_coverage=1 00:08:58.713 --rc genhtml_function_coverage=1 00:08:58.713 --rc genhtml_legend=1 00:08:58.713 --rc geninfo_all_blocks=1 00:08:58.713 --rc geninfo_unexecuted_blocks=1 00:08:58.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.713 ' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:58.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:58.713 --rc genhtml_branch_coverage=1 00:08:58.713 --rc genhtml_function_coverage=1 00:08:58.713 --rc genhtml_legend=1 00:08:58.713 --rc geninfo_all_blocks=1 00:08:58.713 --rc geninfo_unexecuted_blocks=1 00:08:58.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:58.713 ' 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:58.713 12:04:27 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:58.713 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:58.713 12:04:27 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:58.713 [2024-11-27 12:04:27.440544] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:58.713 [2024-11-27 12:04:27.440628] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1733514 ] 00:08:58.713 [2024-11-27 12:04:27.515446] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.714 [2024-11-27 12:04:27.558837] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.973 INFO: Running with entropic power schedule (0xFF, 100). 00:08:58.973 INFO: Seed: 1359844303 00:08:58.973 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:58.973 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:58.973 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:58.973 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.973 #2 INITED exec/s: 0 rss: 66Mb 00:08:58.973 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.973 This may also happen if the target rejected all inputs we tried so far 00:08:58.973 [2024-11-27 12:04:27.795411] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:59.491 NEW_FUNC[1/668]: 0x459068 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:59.491 NEW_FUNC[2/668]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:59.491 #44 NEW cov: 11086 ft: 11029 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:59.491 #45 NEW cov: 11100 ft: 13793 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:59.750 #50 NEW cov: 11110 ft: 15265 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 5 InsertByte-InsertRepeatedBytes-ShuffleBytes-CrossOver-InsertByte- 00:08:59.750 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:59.750 #56 NEW cov: 11127 ft: 15687 corp: 5/25b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:09:00.009 #57 NEW cov: 11127 ft: 15730 corp: 6/31b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:00.009 #58 NEW cov: 11127 ft: 16227 corp: 7/37b lim: 6 exec/s: 58 rss: 74Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:00.274 #59 NEW cov: 11127 ft: 16548 corp: 8/43b lim: 6 exec/s: 59 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:09:00.274 #61 NEW cov: 11127 ft: 16832 corp: 9/49b lim: 6 exec/s: 61 rss: 74Mb L: 6/6 MS: 2 EraseBytes-InsertByte- 00:09:00.533 #62 NEW cov: 11127 ft: 16988 corp: 10/55b lim: 6 exec/s: 62 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:00.533 #63 NEW cov: 11127 ft: 17170 corp: 11/61b lim: 6 exec/s: 63 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:00.791 #64 NEW cov: 11127 ft: 17237 corp: 12/67b lim: 6 exec/s: 64 rss: 75Mb L: 6/6 MS: 1 CrossOver- 00:09:00.791 #70 NEW cov: 11134 ft: 17420 corp: 13/73b lim: 6 exec/s: 70 rss: 
75Mb L: 6/6 MS: 1 ChangeByte- 00:09:01.050 #71 NEW cov: 11134 ft: 17434 corp: 14/79b lim: 6 exec/s: 71 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:01.050 #72 NEW cov: 11134 ft: 17661 corp: 15/85b lim: 6 exec/s: 36 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:01.050 #72 DONE cov: 11134 ft: 17661 corp: 15/85b lim: 6 exec/s: 36 rss: 75Mb 00:09:01.050 Done 72 runs in 2 second(s) 00:09:01.050 [2024-11-27 12:04:29.833781] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:01.309 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:01.309 12:04:30 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:01.309 [2024-11-27 12:04:30.129149] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
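The per-target output in this log follows the usual libFuzzer format: each '#N NEW' line reports the input number, cumulative edge coverage (cov:), feature count (ft:), corpus size and total bytes (corp:), the current size limit (lim:), execution rate (exec/s:) and resident memory (rss:), and each target ends with a '#N DONE' summary plus a 'Done N runs in M second(s)' line. As a minimal sketch (not part of the test scripts), the snippet below pulls those end-of-run summaries out of a saved copy of this console output; the LOGFILE name is an assumption for illustration.

  # Pull the end-of-run summaries out of a captured fuzzing log.
  # LOGFILE is an assumed name; point it at wherever this console output was saved.
  LOGFILE=${1:-vfio_fuzz_console.log}

  echo "== final coverage per target =="
  grep -Eo '#[0-9]+ DONE cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b lim: [0-9]+ exec/s: [0-9]+ rss: [0-9]+Mb' "$LOGFILE"

  echo "== wall-clock per target =="
  grep -Eo 'Done [0-9]+ runs in [0-9]+ second\(s\)' "$LOGFILE"

Comparing the DONE lines across the seven vfio fuzzers gives a quick view of which target grew coverage fastest within the one-second budget used by this short run.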
00:09:01.309 [2024-11-27 12:04:30.129219] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1733949 ] 00:09:01.567 [2024-11-27 12:04:30.202283] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.567 [2024-11-27 12:04:30.242041] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.567 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.567 INFO: Seed: 4050867265 00:09:01.875 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:01.875 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:01.875 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:01.875 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.875 #2 INITED exec/s: 0 rss: 66Mb 00:09:01.875 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.875 This may also happen if the target rejected all inputs we tried so far 00:09:01.875 [2024-11-27 12:04:30.494350] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:09:01.875 [2024-11-27 12:04:30.537657] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:01.875 [2024-11-27 12:04:30.537701] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:01.875 [2024-11-27 12:04:30.537735] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.133 NEW_FUNC[1/670]: 0x459608 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:09:02.133 NEW_FUNC[2/670]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:02.133 #106 NEW cov: 11078 ft: 11039 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 4 InsertByte-ShuffleBytes-CopyPart-InsertByte- 00:09:02.133 [2024-11-27 12:04:31.014463] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.133 [2024-11-27 12:04:31.014501] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.133 [2024-11-27 12:04:31.014519] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.391 #112 NEW cov: 11092 ft: 14441 corp: 3/9b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:09:02.391 [2024-11-27 12:04:31.210800] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.391 [2024-11-27 12:04:31.210826] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.391 [2024-11-27 12:04:31.210845] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.650 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:02.650 #113 NEW cov: 11109 ft: 14832 corp: 4/13b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:09:02.650 [2024-11-27 12:04:31.396718] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.650 [2024-11-27 12:04:31.396743] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid 
argument 00:09:02.650 [2024-11-27 12:04:31.396760] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.650 #116 NEW cov: 11109 ft: 15503 corp: 5/17b lim: 4 exec/s: 116 rss: 74Mb L: 4/4 MS: 3 CrossOver-ChangeBinInt-CopyPart- 00:09:02.907 [2024-11-27 12:04:31.579501] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.907 [2024-11-27 12:04:31.579525] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.907 [2024-11-27 12:04:31.579543] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:02.907 #125 NEW cov: 11109 ft: 16520 corp: 6/21b lim: 4 exec/s: 125 rss: 74Mb L: 4/4 MS: 4 CrossOver-ChangeBit-ChangeASCIIInt-InsertByte- 00:09:02.907 [2024-11-27 12:04:31.763051] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:02.907 [2024-11-27 12:04:31.763075] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:02.907 [2024-11-27 12:04:31.763092] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.165 #126 NEW cov: 11109 ft: 16688 corp: 7/25b lim: 4 exec/s: 126 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:09:03.165 [2024-11-27 12:04:31.946671] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:03.165 [2024-11-27 12:04:31.946695] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:03.165 [2024-11-27 12:04:31.946714] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.424 #127 NEW cov: 11109 ft: 16744 corp: 8/29b lim: 4 exec/s: 127 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:09:03.424 [2024-11-27 12:04:32.131484] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:03.424 [2024-11-27 12:04:32.131507] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:03.424 [2024-11-27 12:04:32.131528] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.424 #128 NEW cov: 11109 ft: 16925 corp: 9/33b lim: 4 exec/s: 128 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:09:03.682 [2024-11-27 12:04:32.312956] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:03.682 [2024-11-27 12:04:32.312979] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:03.682 [2024-11-27 12:04:32.312996] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.682 #129 NEW cov: 11116 ft: 17054 corp: 10/37b lim: 4 exec/s: 129 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:09:03.682 [2024-11-27 12:04:32.494806] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:03.682 [2024-11-27 12:04:32.494829] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:03.682 [2024-11-27 12:04:32.494846] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:03.941 #130 NEW cov: 11116 ft: 17390 corp: 11/41b lim: 4 exec/s: 65 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:09:03.941 #130 DONE cov: 11116 ft: 17390 corp: 11/41b lim: 4 exec/s: 65 rss: 74Mb 00:09:03.941 Done 130 runs in 2 second(s) 00:09:03.941 [2024-11-27 12:04:32.626783] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf 
/tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:04.200 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:04.201 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:04.201 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:04.201 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:04.201 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:04.201 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:04.201 12:04:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:04.201 [2024-11-27 12:04:32.916426] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:09:04.201 [2024-11-27 12:04:32.916510] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1734342 ] 00:09:04.201 [2024-11-27 12:04:32.988466] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.201 [2024-11-27 12:04:33.027145] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.459 INFO: Running with entropic power schedule (0xFF, 100). 
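Each fuzzer instance above is given its own scratch tree (/tmp/vfio-user-N with domain/1 and domain/2) and a private copy of fuzz_vfio_json.conf produced by the sed command at vfio/run.sh@39, so the seven targets never share vfio-user sockets. Below is a minimal stand-alone sketch of that templating step; the FUZZER_ID variable is an illustrative assumption, while the template path and substitution pattern are the ones visible in the log.

  # Recreate the per-instance setup from vfio/run.sh@36-39: give fuzzer N its own
  # /tmp/vfio-user-N tree and rewrite the generic socket paths in the JSON config to match.
  FUZZER_ID=${1:-0}
  template=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
  fuzzer_dir="/tmp/vfio-user-${FUZZER_ID}"

  mkdir -p "$fuzzer_dir/domain/1" "$fuzzer_dir/domain/2"

  # Point the config's /tmp/vfio-user/domain/{1,2} references at this instance's private tree.
  sed -e "s%/tmp/vfio-user/domain/1%${fuzzer_dir}/domain/1%; s%/tmp/vfio-user/domain/2%${fuzzer_dir}/domain/2%" \
      "$template" > "$fuzzer_dir/fuzz_vfio_json.conf"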
00:09:04.459 INFO: Seed: 2533421468 00:09:04.459 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:04.459 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:04.459 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:04.459 INFO: A corpus is not provided, starting from an empty corpus 00:09:04.459 #2 INITED exec/s: 0 rss: 66Mb 00:09:04.459 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:04.459 This may also happen if the target rejected all inputs we tried so far 00:09:04.459 [2024-11-27 12:04:33.265951] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:04.459 [2024-11-27 12:04:33.318235] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:04.977 NEW_FUNC[1/667]: 0x459ff8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:04.977 NEW_FUNC[2/667]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:04.977 #51 NEW cov: 10970 ft: 11013 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 4 CopyPart-ChangeByte-CMP-InsertRepeatedBytes- DE: "\377\377"- 00:09:04.977 [2024-11-27 12:04:33.825333] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:05.236 NEW_FUNC[1/2]: 0x15bf2c8 in handle_sq_tdbl_write /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2551 00:09:05.236 NEW_FUNC[2/2]: 0x18cde28 in nvme_pcie_qpair_build_contig_hw_sgl_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:1307 00:09:05.236 #52 NEW cov: 11079 ft: 14285 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:05.236 [2024-11-27 12:04:34.062121] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:05.495 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:05.495 #58 NEW cov: 11096 ft: 15164 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:05.495 [2024-11-27 12:04:34.267698] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:05.754 #59 NEW cov: 11096 ft: 16641 corp: 5/33b lim: 8 exec/s: 59 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:09:05.754 [2024-11-27 12:04:34.486638] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:05.754 #62 NEW cov: 11096 ft: 16951 corp: 6/41b lim: 8 exec/s: 62 rss: 74Mb L: 8/8 MS: 3 EraseBytes-PersAutoDict-CrossOver- DE: "\377\377"- 00:09:06.012 [2024-11-27 12:04:34.701876] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:06.012 #63 NEW cov: 11096 ft: 17374 corp: 7/49b lim: 8 exec/s: 63 rss: 75Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:06.271 [2024-11-27 12:04:34.907173] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:06.271 #64 NEW cov: 11096 ft: 17605 corp: 8/57b lim: 8 exec/s: 64 rss: 75Mb L: 8/8 MS: 1 ChangeBit- 00:09:06.271 [2024-11-27 12:04:35.116963] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:06.530 #65 NEW cov: 11103 ft: 17762 corp: 9/65b lim: 8 exec/s: 32 rss: 75Mb L: 8/8 MS: 1 ChangeBit- 00:09:06.530 #65 
DONE cov: 11103 ft: 17762 corp: 9/65b lim: 8 exec/s: 32 rss: 75Mb 00:09:06.530 ###### Recommended dictionary. ###### 00:09:06.530 "\377\377" # Uses: 1 00:09:06.530 ###### End of recommended dictionary. ###### 00:09:06.530 Done 65 runs in 2 second(s) 00:09:06.530 [2024-11-27 12:04:35.258791] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:06.789 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:06.789 12:04:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:06.789 [2024-11-27 12:04:35.545790] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
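The '###### Recommended dictionary ######' block printed at the end of the previous run is standard libFuzzer output: tokens such as "\377\377" that improved coverage during the run. Those quoted lines are already in libFuzzer's dictionary-file format, so they can be clipped out of a saved log and reused as seeds for later runs; the sketch below assumes the console output was captured to a file, and this log does not show whether the SPDK wrapper forwards a dictionary option to the fuzzer.

  # Extract libFuzzer's recommended-dictionary block from a captured log.
  # LOGFILE and DICT are assumed names for illustration.
  LOGFILE=${1:-vfio_fuzz_console.log}
  DICT=${2:-vfio_fuzz.dict}

  # Keep only the quoted entries between the start and end markers.
  sed -n '/Recommended dictionary/,/End of recommended dictionary/p' "$LOGFILE" |
    grep -Eo '"[^"]*"' > "$DICT"

  echo "wrote $(wc -l < "$DICT") dictionary entries to $DICT"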
00:09:06.789 [2024-11-27 12:04:35.545878] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1734880 ] 00:09:06.789 [2024-11-27 12:04:35.616498] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.789 [2024-11-27 12:04:35.654239] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.048 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.048 INFO: Seed: 865924511 00:09:07.048 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:07.048 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:07.048 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:07.048 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.048 #2 INITED exec/s: 0 rss: 66Mb 00:09:07.048 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:07.048 This may also happen if the target rejected all inputs we tried so far 00:09:07.048 [2024-11-27 12:04:35.897618] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:07.566 NEW_FUNC[1/669]: 0x45a6e8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:07.566 NEW_FUNC[2/669]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.566 #97 NEW cov: 11069 ft: 11033 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 5 ShuffleBytes-InsertRepeatedBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:07.824 #98 NEW cov: 11083 ft: 13962 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:08.084 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:08.084 #99 NEW cov: 11100 ft: 15430 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:08.084 #100 NEW cov: 11100 ft: 16538 corp: 5/129b lim: 32 exec/s: 100 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:09:08.343 #101 NEW cov: 11100 ft: 16950 corp: 6/161b lim: 32 exec/s: 101 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:08.601 #102 NEW cov: 11100 ft: 17293 corp: 7/193b lim: 32 exec/s: 102 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:08.860 #103 NEW cov: 11100 ft: 17397 corp: 8/225b lim: 32 exec/s: 103 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:08.860 #104 NEW cov: 11107 ft: 17697 corp: 9/257b lim: 32 exec/s: 104 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:09.119 #105 NEW cov: 11107 ft: 17766 corp: 10/289b lim: 32 exec/s: 52 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:09.119 #105 DONE cov: 11107 ft: 17766 corp: 10/289b lim: 32 exec/s: 52 rss: 75Mb 00:09:09.119 Done 105 runs in 2 second(s) 00:09:09.119 [2024-11-27 12:04:37.956790] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:09.378 12:04:38 
llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:09.378 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:09.378 12:04:38 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:09.378 [2024-11-27 12:04:38.248897] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:09:09.378 [2024-11-27 12:04:38.248990] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735410 ] 00:09:09.638 [2024-11-27 12:04:38.320369] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.638 [2024-11-27 12:04:38.358429] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.897 INFO: Running with entropic power schedule (0xFF, 100). 00:09:09.897 INFO: Seed: 3567905480 00:09:09.897 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:09.897 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:09.897 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:09.897 INFO: A corpus is not provided, starting from an empty corpus 00:09:09.897 #2 INITED exec/s: 0 rss: 66Mb 00:09:09.897 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:09:09.897 This may also happen if the target rejected all inputs we tried so far 00:09:09.897 [2024-11-27 12:04:38.588179] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:10.156 NEW_FUNC[1/669]: 0x45af68 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:10.156 NEW_FUNC[2/669]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:10.156 #19 NEW cov: 11066 ft: 11038 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 2 InsertRepeatedBytes-InsertByte- 00:09:10.415 #25 NEW cov: 11089 ft: 14085 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:10.674 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:10.674 #51 NEW cov: 11106 ft: 14720 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:10.932 #57 NEW cov: 11106 ft: 15127 corp: 5/129b lim: 32 exec/s: 57 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:10.932 #58 NEW cov: 11106 ft: 15540 corp: 6/161b lim: 32 exec/s: 58 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:11.191 #59 NEW cov: 11106 ft: 15816 corp: 7/193b lim: 32 exec/s: 59 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:11.450 #65 NEW cov: 11106 ft: 17495 corp: 8/225b lim: 32 exec/s: 65 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:09:11.709 #66 NEW cov: 11106 ft: 18026 corp: 9/257b lim: 32 exec/s: 66 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:09:11.709 #72 NEW cov: 11106 ft: 18102 corp: 10/289b lim: 32 exec/s: 36 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:11.709 #72 DONE cov: 11106 ft: 18102 corp: 10/289b lim: 32 exec/s: 36 rss: 74Mb 00:09:11.709 Done 72 runs in 2 second(s) 00:09:11.709 [2024-11-27 12:04:40.574792] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:11.969 
12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:11.969 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:11.969 12:04:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:12.228 [2024-11-27 12:04:40.862901] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:09:12.228 [2024-11-27 12:04:40.862979] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735780 ] 00:09:12.228 [2024-11-27 12:04:40.934894] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.228 [2024-11-27 12:04:40.973404] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.488 INFO: Running with entropic power schedule (0xFF, 100). 00:09:12.488 INFO: Seed: 1895950475 00:09:12.488 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:12.488 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:12.488 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:12.488 INFO: A corpus is not provided, starting from an empty corpus 00:09:12.488 #2 INITED exec/s: 0 rss: 66Mb 00:09:12.488 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
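The repeated 'WARNING: no interesting inputs were found so far' message is expected in this job: every corpus directory reports '0 files found' and libFuzzer notes it is 'starting from an empty corpus', so there is no prior input to replay before mutation starts. Pre-seeding the corpus directory that run.sh hands to the fuzzer via -D would remove the warning and speed up early coverage growth; in the sketch below the seed source is an assumption, while the corpus path is the one shown in the log.

  # Pre-seed the per-target corpus directory passed to llvm_vfio_fuzz with -D.
  # FUZZER_ID and SEED_SRC are illustrative assumptions.
  FUZZER_ID=${1:-5}
  SEED_SRC=${2:-"$HOME/vfio_fuzz_seeds"}
  corpus_dir="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_${FUZZER_ID}"

  mkdir -p "$corpus_dir"
  if compgen -G "$SEED_SRC/*" > /dev/null; then
      cp -n "$SEED_SRC"/* "$corpus_dir"/
  fi
  echo "$(ls "$corpus_dir" | wc -l) seed file(s) in $corpus_dir"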
00:09:12.488 This may also happen if the target rejected all inputs we tried so far 00:09:12.488 [2024-11-27 12:04:41.217975] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:12.488 [2024-11-27 12:04:41.275682] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.488 [2024-11-27 12:04:41.275719] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.005 NEW_FUNC[1/670]: 0x45b968 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:13.005 NEW_FUNC[2/670]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:13.005 #53 NEW cov: 11079 ft: 11039 corp: 2/14b lim: 13 exec/s: 0 rss: 71Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:09:13.005 [2024-11-27 12:04:41.749157] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.005 [2024-11-27 12:04:41.749200] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.005 #69 NEW cov: 11095 ft: 13782 corp: 3/27b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 CopyPart- 00:09:13.264 [2024-11-27 12:04:41.942489] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.264 [2024-11-27 12:04:41.942520] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.264 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:13.264 #75 NEW cov: 11112 ft: 15405 corp: 4/40b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 CopyPart- 00:09:13.264 [2024-11-27 12:04:42.134513] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.264 [2024-11-27 12:04:42.134544] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.523 #76 NEW cov: 11115 ft: 15922 corp: 5/53b lim: 13 exec/s: 76 rss: 73Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:13.523 [2024-11-27 12:04:42.338858] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.523 [2024-11-27 12:04:42.338893] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.782 #77 NEW cov: 11115 ft: 16023 corp: 6/66b lim: 13 exec/s: 77 rss: 73Mb L: 13/13 MS: 1 CrossOver- 00:09:13.782 [2024-11-27 12:04:42.534252] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.782 [2024-11-27 12:04:42.534284] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.782 #83 NEW cov: 11115 ft: 16219 corp: 7/79b lim: 13 exec/s: 83 rss: 73Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:14.041 [2024-11-27 12:04:42.726077] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.041 [2024-11-27 12:04:42.726108] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.041 #84 NEW cov: 11115 ft: 16850 corp: 8/92b lim: 13 exec/s: 84 rss: 73Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:14.041 [2024-11-27 12:04:42.917883] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.041 [2024-11-27 12:04:42.917914] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.299 #85 NEW cov: 11122 ft: 17455 corp: 9/105b lim: 13 
exec/s: 85 rss: 73Mb L: 13/13 MS: 1 ChangeBit- 00:09:14.299 [2024-11-27 12:04:43.105093] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.299 [2024-11-27 12:04:43.105124] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.558 #86 NEW cov: 11122 ft: 17746 corp: 10/118b lim: 13 exec/s: 43 rss: 73Mb L: 13/13 MS: 1 CrossOver- 00:09:14.558 #86 DONE cov: 11122 ft: 17746 corp: 10/118b lim: 13 exec/s: 43 rss: 73Mb 00:09:14.558 Done 86 runs in 2 second(s) 00:09:14.558 [2024-11-27 12:04:43.238781] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:14.818 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:14.818 12:04:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:14.818 [2024-11-27 12:04:43.527583] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
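Before each launch, run.sh@43-44 also writes a LeakSanitizer suppression file whitelisting spdk_nvmf_qpair_disconnect and nvmf_ctrlr_create, and LSAN_OPTIONS points at it with report_objects=1 and print_suppressions=0, so allocations reached through those paths are not reported as leaks when the short run exits. The sketch below is a generic stand-alone version of that pattern; the final line is a placeholder for the real llvm_vfio_fuzz command with its -m/-s/-P/-F/-c/-t/-D/-Y/-r/-Z flags, and exporting the variable stands in for however run.sh passes it to the child process.

  # Build the LSAN suppression file used around each fuzzer run (one 'leak:<symbol>' per line).
  suppress_file=/var/tmp/suppress_vfio_fuzz

  echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"

  # report_objects=1 lists suppressed objects; print_suppressions=0 keeps the summary quiet.
  export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"
  ./llvm_vfio_fuzz "$@"   # placeholder target path; substitute the full command shown in the log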
00:09:14.818 [2024-11-27 12:04:43.527683] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736240 ] 00:09:14.818 [2024-11-27 12:04:43.600149] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.818 [2024-11-27 12:04:43.638017] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.077 INFO: Running with entropic power schedule (0xFF, 100). 00:09:15.077 INFO: Seed: 262977146 00:09:15.077 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:09:15.077 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:09:15.077 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:15.077 INFO: A corpus is not provided, starting from an empty corpus 00:09:15.077 #2 INITED exec/s: 0 rss: 66Mb 00:09:15.077 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:15.077 This may also happen if the target rejected all inputs we tried so far 00:09:15.077 [2024-11-27 12:04:43.873024] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:15.077 [2024-11-27 12:04:43.930626] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.077 [2024-11-27 12:04:43.930661] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.595 NEW_FUNC[1/670]: 0x45c658 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:15.595 NEW_FUNC[2/670]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:15.595 #16 NEW cov: 11071 ft: 10986 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 4 InsertRepeatedBytes-InsertByte-ShuffleBytes-InsertByte- 00:09:15.595 [2024-11-27 12:04:44.418515] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.595 [2024-11-27 12:04:44.418555] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.854 #32 NEW cov: 11089 ft: 13895 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:09:15.854 [2024-11-27 12:04:44.614053] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.854 [2024-11-27 12:04:44.614086] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.854 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:09:15.854 #33 NEW cov: 11106 ft: 15667 corp: 4/28b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ChangeBit- 00:09:16.112 [2024-11-27 12:04:44.809916] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.112 [2024-11-27 12:04:44.809947] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.112 #39 NEW cov: 11106 ft: 16058 corp: 5/37b lim: 9 exec/s: 39 rss: 74Mb L: 9/9 MS: 1 CMP- DE: "\377\377\377\377\377\377\377v"- 00:09:16.370 [2024-11-27 12:04:45.015050] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.370 [2024-11-27 12:04:45.015082] vfio_user.c: 144:vfio_user_read: *ERROR*: 
Command 8 return failure 00:09:16.370 #40 NEW cov: 11106 ft: 16658 corp: 6/46b lim: 9 exec/s: 40 rss: 74Mb L: 9/9 MS: 1 ChangeByte- 00:09:16.370 [2024-11-27 12:04:45.210853] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.370 [2024-11-27 12:04:45.210883] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.628 #41 NEW cov: 11106 ft: 17379 corp: 7/55b lim: 9 exec/s: 41 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:09:16.628 [2024-11-27 12:04:45.406001] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.628 [2024-11-27 12:04:45.406034] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.887 #45 NEW cov: 11106 ft: 17574 corp: 8/64b lim: 9 exec/s: 45 rss: 74Mb L: 9/9 MS: 4 EraseBytes-ChangeBit-CrossOver-CrossOver- 00:09:16.887 [2024-11-27 12:04:45.598685] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:16.887 [2024-11-27 12:04:45.598719] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:16.887 #46 NEW cov: 11113 ft: 17827 corp: 9/73b lim: 9 exec/s: 46 rss: 74Mb L: 9/9 MS: 1 ChangeByte- 00:09:17.146 [2024-11-27 12:04:45.796015] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:17.146 [2024-11-27 12:04:45.796046] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:17.146 #47 NEW cov: 11113 ft: 18169 corp: 10/82b lim: 9 exec/s: 23 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:17.146 #47 DONE cov: 11113 ft: 18169 corp: 10/82b lim: 9 exec/s: 23 rss: 74Mb 00:09:17.146 ###### Recommended dictionary. ###### 00:09:17.146 "\377\377\377\377\377\377\377v" # Uses: 0 00:09:17.146 ###### End of recommended dictionary. 
###### 00:09:17.146 Done 47 runs in 2 second(s) 00:09:17.146 [2024-11-27 12:04:45.935801] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:17.405 00:09:17.405 real 0m19.257s 00:09:17.405 user 0m26.823s 00:09:17.405 sys 0m1.979s 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:17.405 12:04:46 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:17.405 ************************************ 00:09:17.405 END TEST vfio_llvm_fuzz 00:09:17.405 ************************************ 00:09:17.405 00:09:17.405 real 1m22.363s 00:09:17.405 user 2m6.304s 00:09:17.405 sys 0m9.319s 00:09:17.405 12:04:46 llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:17.405 12:04:46 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:17.405 ************************************ 00:09:17.405 END TEST llvm_fuzz 00:09:17.405 ************************************ 00:09:17.405 12:04:46 -- spdk/autotest.sh@374 -- # [[ '' -eq 1 ]] 00:09:17.405 12:04:46 -- spdk/autotest.sh@381 -- # trap - SIGINT SIGTERM EXIT 00:09:17.405 12:04:46 -- spdk/autotest.sh@383 -- # timing_enter post_cleanup 00:09:17.405 12:04:46 -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:17.405 12:04:46 -- common/autotest_common.sh@10 -- # set +x 00:09:17.405 12:04:46 -- spdk/autotest.sh@384 -- # autotest_cleanup 00:09:17.405 12:04:46 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:09:17.405 12:04:46 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:09:17.405 12:04:46 -- common/autotest_common.sh@10 -- # set +x 00:09:24.087 INFO: APP EXITING 00:09:24.087 INFO: killing all VMs 00:09:24.087 INFO: killing vhost app 00:09:24.087 INFO: EXIT DONE 00:09:25.990 Waiting for block devices as requested 00:09:25.990 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:25.990 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:25.990 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:25.990 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:26.249 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:26.249 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:26.249 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:26.508 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:26.508 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:26.508 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:26.508 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:26.767 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:26.767 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:26.767 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:27.025 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:27.025 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:27.025 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:31.213 Cleaning 00:09:31.213 Removing: /dev/shm/spdk_tgt_trace.pid1708512 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1706051 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1707173 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1708512 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1708978 00:09:31.213 Removing: 
/var/run/dpdk/spdk_pid1709935 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1710083 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1711194 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1711204 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1711636 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1711962 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1712285 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1712455 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1712702 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1712993 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1713273 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1713591 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1714406 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1717504 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1717967 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1718491 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1718507 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1719073 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1719078 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1719649 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1719727 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1720091 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1720210 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1720372 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1720510 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1720896 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1721180 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1721462 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1721679 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1722297 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1722753 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1723120 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1723651 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1724000 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1724472 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1725001 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1725298 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1725826 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1726287 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1726648 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1727182 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1727486 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1728000 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1728487 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1728822 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1729353 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1729735 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1730173 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1730706 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1730995 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1731527 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1731992 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1732363 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1732892 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1733514 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1733949 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1734342 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1734880 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1735410 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1735780 00:09:31.213 Removing: /var/run/dpdk/spdk_pid1736240 00:09:31.213 Clean 00:09:31.213 12:04:59 -- common/autotest_common.sh@1451 -- # return 0 00:09:31.213 12:04:59 -- spdk/autotest.sh@385 -- # timing_exit post_cleanup 00:09:31.213 12:04:59 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:31.213 12:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:31.213 12:04:59 -- 
spdk/autotest.sh@387 -- # timing_exit autotest 00:09:31.213 12:04:59 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:31.213 12:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:31.213 12:04:59 -- spdk/autotest.sh@388 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:31.213 12:04:59 -- spdk/autotest.sh@390 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:31.213 12:04:59 -- spdk/autotest.sh@390 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:31.213 12:04:59 -- spdk/autotest.sh@392 -- # [[ y == y ]] 00:09:31.213 12:04:59 -- spdk/autotest.sh@394 -- # hostname 00:09:31.213 12:04:59 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:31.471 geninfo: WARNING: invalid characters removed from testname! 00:09:34.754 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:09:40.024 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:43.309 12:05:12 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:51.425 12:05:19 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:56.693 12:05:24 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:01.957 12:05:29 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:07.228 12:05:35 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:11.423 12:05:40 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:16.689 12:05:45 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:16.689 12:05:45 -- common/autotest_common.sh@1680 -- $ [[ y == y ]] 00:10:16.689 12:05:45 -- common/autotest_common.sh@1681 -- $ lcov --version 00:10:16.689 12:05:45 -- common/autotest_common.sh@1681 -- $ awk '{print $NF}' 00:10:16.689 12:05:45 -- common/autotest_common.sh@1681 -- $ lt 1.15 2 00:10:16.689 12:05:45 -- scripts/common.sh@373 -- $ cmp_versions 1.15 '<' 2 00:10:16.689 12:05:45 -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:10:16.689 12:05:45 -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:10:16.689 12:05:45 -- scripts/common.sh@336 -- $ IFS=.-: 00:10:16.689 12:05:45 -- scripts/common.sh@336 -- $ read -ra ver1 00:10:16.689 12:05:45 -- scripts/common.sh@337 -- $ IFS=.-: 00:10:16.689 12:05:45 -- scripts/common.sh@337 -- $ read -ra ver2 00:10:16.689 12:05:45 -- scripts/common.sh@338 -- $ local 'op=<' 00:10:16.689 12:05:45 -- scripts/common.sh@340 -- $ ver1_l=2 00:10:16.689 12:05:45 -- scripts/common.sh@341 -- $ ver2_l=1 00:10:16.689 12:05:45 -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:10:16.689 12:05:45 -- scripts/common.sh@344 -- $ case "$op" in 00:10:16.689 12:05:45 -- scripts/common.sh@345 -- $ : 1 00:10:16.689 12:05:45 -- scripts/common.sh@364 -- $ (( v = 0 )) 00:10:16.689 12:05:45 -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:16.689 12:05:45 -- scripts/common.sh@365 -- $ decimal 1 00:10:16.689 12:05:45 -- scripts/common.sh@353 -- $ local d=1 00:10:16.689 12:05:45 -- scripts/common.sh@354 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:10:16.689 12:05:45 -- scripts/common.sh@355 -- $ echo 1 00:10:16.689 12:05:45 -- scripts/common.sh@365 -- $ ver1[v]=1 00:10:16.689 12:05:45 -- scripts/common.sh@366 -- $ decimal 2 00:10:16.689 12:05:45 -- scripts/common.sh@353 -- $ local d=2 00:10:16.689 12:05:45 -- scripts/common.sh@354 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:10:16.689 12:05:45 -- scripts/common.sh@355 -- $ echo 2 00:10:16.689 12:05:45 -- scripts/common.sh@366 -- $ ver2[v]=2 00:10:16.689 12:05:45 -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:10:16.689 12:05:45 -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:10:16.689 12:05:45 -- scripts/common.sh@368 -- $ return 0 00:10:16.689 12:05:45 -- common/autotest_common.sh@1682 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:16.689 12:05:45 -- common/autotest_common.sh@1694 -- $ export 'LCOV_OPTS= 00:10:16.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.689 --rc genhtml_branch_coverage=1 00:10:16.689 --rc genhtml_function_coverage=1 00:10:16.689 --rc genhtml_legend=1 00:10:16.689 --rc geninfo_all_blocks=1 00:10:16.689 --rc geninfo_unexecuted_blocks=1 00:10:16.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.689 ' 00:10:16.689 12:05:45 -- common/autotest_common.sh@1694 -- $ LCOV_OPTS=' 00:10:16.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.689 --rc genhtml_branch_coverage=1 00:10:16.689 --rc genhtml_function_coverage=1 00:10:16.689 --rc genhtml_legend=1 00:10:16.689 --rc geninfo_all_blocks=1 00:10:16.689 --rc geninfo_unexecuted_blocks=1 00:10:16.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.689 ' 00:10:16.689 12:05:45 -- common/autotest_common.sh@1695 -- $ export 'LCOV=lcov 00:10:16.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.689 --rc genhtml_branch_coverage=1 00:10:16.689 --rc genhtml_function_coverage=1 00:10:16.689 --rc genhtml_legend=1 00:10:16.689 --rc geninfo_all_blocks=1 00:10:16.689 --rc geninfo_unexecuted_blocks=1 00:10:16.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.689 ' 00:10:16.689 12:05:45 -- common/autotest_common.sh@1695 -- $ LCOV='lcov 00:10:16.689 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:16.689 --rc genhtml_branch_coverage=1 00:10:16.689 --rc genhtml_function_coverage=1 00:10:16.689 --rc genhtml_legend=1 00:10:16.689 --rc geninfo_all_blocks=1 00:10:16.689 --rc geninfo_unexecuted_blocks=1 00:10:16.689 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:16.689 ' 00:10:16.689 12:05:45 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:16.689 12:05:45 -- scripts/common.sh@15 -- $ shopt -s extglob 00:10:16.689 12:05:45 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:10:16.689 12:05:45 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:16.689 12:05:45 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:16.689 12:05:45 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.689 12:05:45 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.689 12:05:45 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.689 12:05:45 -- paths/export.sh@5 -- $ export PATH 00:10:16.689 12:05:45 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:16.689 12:05:45 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:10:16.689 12:05:45 -- common/autobuild_common.sh@479 -- $ date +%s 00:10:16.689 12:05:45 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1732705545.XXXXXX 00:10:16.689 12:05:45 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1732705545.0ojlPK 00:10:16.689 12:05:45 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:10:16.689 12:05:45 -- common/autobuild_common.sh@485 -- $ '[' -n v23.11 ']' 00:10:16.947 12:05:45 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:16.947 12:05:45 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:10:16.947 12:05:45 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:10:16.947 12:05:45 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:10:16.947 12:05:45 -- common/autobuild_common.sh@495 -- $ get_config_params 00:10:16.947 12:05:45 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:10:16.947 12:05:45 -- common/autotest_common.sh@10 -- $ set +x 00:10:16.947 12:05:45 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:10:16.947 12:05:45 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:10:16.947 12:05:45 -- pm/common@17 -- $ local monitor 00:10:16.947 12:05:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.947 12:05:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.947 12:05:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.947 12:05:45 -- pm/common@21 -- $ date +%s 00:10:16.947 12:05:45 -- pm/common@21 -- $ date +%s 00:10:16.947 12:05:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:16.947 12:05:45 -- pm/common@25 -- $ sleep 1 00:10:16.947 12:05:45 -- pm/common@21 -- $ date +%s 00:10:16.947 12:05:45 -- pm/common@21 -- $ date +%s 00:10:16.947 12:05:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732705545 00:10:16.947 12:05:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732705545 00:10:16.947 12:05:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732705545 00:10:16.947 12:05:45 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732705545 00:10:16.947 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732705545_collect-vmstat.pm.log 00:10:16.947 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732705545_collect-cpu-load.pm.log 00:10:16.947 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732705545_collect-cpu-temp.pm.log 00:10:16.947 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732705545_collect-bmc-pm.bmc.pm.log 00:10:17.883 12:05:46 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:10:17.883 12:05:46 -- spdk/autopackage.sh@10 -- $ [[ 0 -eq 1 ]] 00:10:17.883 12:05:46 -- spdk/autopackage.sh@14 -- $ timing_finish 00:10:17.883 12:05:46 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:17.883 12:05:46 -- common/autotest_common.sh@737 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:10:17.883 12:05:46 -- common/autotest_common.sh@740 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:17.883 12:05:46 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:10:17.883 12:05:46 -- pm/common@29 -- $ signal_monitor_resources TERM 00:10:17.883 12:05:46 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:10:17.883 12:05:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.883 12:05:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:10:17.883 12:05:46 -- pm/common@44 -- $ pid=1744647 00:10:17.884 12:05:46 -- pm/common@50 -- $ kill 
-TERM 1744647 00:10:17.884 12:05:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.884 12:05:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:10:17.884 12:05:46 -- pm/common@44 -- $ pid=1744649 00:10:17.884 12:05:46 -- pm/common@50 -- $ kill -TERM 1744649 00:10:17.884 12:05:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.884 12:05:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:10:17.884 12:05:46 -- pm/common@44 -- $ pid=1744651 00:10:17.884 12:05:46 -- pm/common@50 -- $ kill -TERM 1744651 00:10:17.884 12:05:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.884 12:05:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:10:17.884 12:05:46 -- pm/common@44 -- $ pid=1744675 00:10:17.884 12:05:46 -- pm/common@50 -- $ sudo -E kill -TERM 1744675 00:10:17.884 + [[ -n 1581261 ]] 00:10:17.884 + sudo kill 1581261 00:10:17.893 [Pipeline] } 00:10:17.908 [Pipeline] // stage 00:10:17.913 [Pipeline] } 00:10:17.929 [Pipeline] // timeout 00:10:17.936 [Pipeline] } 00:10:17.952 [Pipeline] // catchError 00:10:17.957 [Pipeline] } 00:10:17.974 [Pipeline] // wrap 00:10:17.980 [Pipeline] } 00:10:17.994 [Pipeline] // catchError 00:10:18.005 [Pipeline] stage 00:10:18.007 [Pipeline] { (Epilogue) 00:10:18.021 [Pipeline] catchError 00:10:18.022 [Pipeline] { 00:10:18.036 [Pipeline] echo 00:10:18.039 Cleanup processes 00:10:18.047 [Pipeline] sh 00:10:18.337 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.337 1744791 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:10:18.337 1745211 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.353 [Pipeline] sh 00:10:18.641 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.641 ++ grep -v 'sudo pgrep' 00:10:18.641 ++ awk '{print $1}' 00:10:18.641 + sudo kill -9 1744791 00:10:18.652 [Pipeline] sh 00:10:18.935 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:10:18.935 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:18.935 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:20.311 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:30.289 [Pipeline] sh 00:10:30.573 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:30.573 Artifacts sizes are good 00:10:30.585 [Pipeline] archiveArtifacts 00:10:30.592 Archiving artifacts 00:10:30.764 [Pipeline] sh 00:10:31.127 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:31.143 [Pipeline] cleanWs 00:10:31.154 [WS-CLEANUP] Deleting project workspace... 00:10:31.154 [WS-CLEANUP] Deferred wipeout is used... 00:10:31.161 [WS-CLEANUP] done 00:10:31.163 [Pipeline] } 00:10:31.182 [Pipeline] // catchError 00:10:31.198 [Pipeline] sh 00:10:31.481 + logger -p user.info -t JENKINS-CI 00:10:31.491 [Pipeline] } 00:10:31.505 [Pipeline] // stage 00:10:31.511 [Pipeline] } 00:10:31.525 [Pipeline] // node 00:10:31.531 [Pipeline] End of Pipeline 00:10:31.577 Finished: SUCCESS